A significant number of hotel bookings are called off due to cancellations or no-shows, typically because of changed plans or scheduling conflicts. Cancelling is often made easy by the option to do so free of charge, or at a low cost, which benefits guests but is an undesirable and potentially revenue-diminishing factor for hotels. Losses are particularly high for last-minute cancellations.
The rise of online booking channels has dramatically changed customers' booking possibilities and behavior. This adds a further dimension to the challenge of handling cancellations, which is no longer shaped only by traditional booking and guest characteristics.
The cancellation of bookings impacts a hotel on various fronts:
The increasing number of cancellations calls for a machine-learning solution that can predict which bookings are likely to be canceled. Star Hotels Group, a chain of hotels in Portugal, is facing a high number of booking cancellations and has reached out to your firm for data-driven solutions. You, as a data scientist, have to analyze the data provided to find which factors have a high influence on booking cancellations, build a model that predicts in advance which bookings will be canceled, and help formulate profitable cancellation and refund policies.
The data contains the different attributes of customers' booking details. The detailed data dictionary is given below.
Data Dictionary
!pip install -U scikit-learn
Requirement already up-to-date: scikit-learn in d:\anaconda\lib\site-packages (0.24.2) Requirement already satisfied, skipping upgrade: scipy>=0.19.1 in d:\anaconda\lib\site-packages (from scikit-learn) (1.6.1) Requirement already satisfied, skipping upgrade: numpy>=1.13.3 in d:\anaconda\lib\site-packages (from scikit-learn) (1.19.2) Requirement already satisfied, skipping upgrade: threadpoolctl>=2.0.0 in d:\anaconda\lib\site-packages (from scikit-learn) (2.1.0) Requirement already satisfied, skipping upgrade: joblib>=0.11 in d:\anaconda\lib\site-packages (from scikit-learn) (0.17.0)
%reload_ext nb_black
import warnings
warnings.filterwarnings("ignore")
# to read data
import pandas as pd
import numpy as np
# to split data
from sklearn.model_selection import train_test_split
# visualization
import matplotlib.pyplot as plt
import seaborn as sns
# remove the limits on displayed rows/columns
pd.set_option("display.max_columns", None)
pd.set_option("display.max_rows", None)
# decision tree
from sklearn.tree import DecisionTreeClassifier
from sklearn import tree
# tune models
from sklearn.model_selection import GridSearchCV
# statistical analysis
import scipy.stats as stats
# metrics to evaluate and compare models
from sklearn.metrics import (
f1_score,
accuracy_score,
recall_score,
precision_score,
roc_auc_score,
roc_curve,
confusion_matrix,
plot_confusion_matrix,
precision_recall_curve,
make_scorer,
)
import statsmodels.api as sm
# set a random seed for reproducibility
np.random.seed(1)
hotel = pd.read_csv("StarHotelsGroup.csv")
# make a copy so the original data is not changed
data = hotel.copy()
data.head()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | 2017 | 10 | 2 | Offline | 0 | 0 | 0 | 65.00 | 0 | Not_Canceled |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | 2018 | 11 | 6 | Online | 0 | 0 | 0 | 106.68 | 1 | Not_Canceled |
| 2 | 1 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 28 | Online | 0 | 0 | 0 | 60.00 | 0 | Canceled |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | 2018 | 5 | 20 | Online | 0 | 0 | 0 | 100.00 | 0 | Canceled |
| 4 | 3 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | 2019 | 7 | 13 | Online | 0 | 0 | 0 | 89.10 | 2 | Canceled |
data.tail()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 56921 | 2 | 1 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | 2019 | 6 | 15 | Online | 0 | 0 | 0 | 163.88 | 1 | Not_Canceled |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | 2019 | 5 | 15 | Offline | 0 | 0 | 0 | 90.00 | 1 | Canceled |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | 2018 | 4 | 21 | Online | 0 | 0 | 0 | 94.50 | 0 | Canceled |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | 2019 | 4 | 28 | Online | 0 | 0 | 0 | 162.50 | 2 | Not_Canceled |
| 56925 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 207 | 2018 | 12 | 30 | Offline | 0 | 0 | 0 | 161.67 | 0 | Not_Canceled |
data.shape
(56926, 18)
data.info()
data.isnull().sum()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 56926 entries, 0 to 56925 Data columns (total 18 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 no_of_adults 56926 non-null int64 1 no_of_children 56926 non-null int64 2 no_of_weekend_nights 56926 non-null int64 3 no_of_week_nights 56926 non-null int64 4 type_of_meal_plan 56926 non-null object 5 required_car_parking_space 56926 non-null int64 6 room_type_reserved 56926 non-null object 7 lead_time 56926 non-null int64 8 arrival_year 56926 non-null int64 9 arrival_month 56926 non-null int64 10 arrival_date 56926 non-null int64 11 market_segment_type 56926 non-null object 12 repeated_guest 56926 non-null int64 13 no_of_previous_cancellations 56926 non-null int64 14 no_of_previous_bookings_not_canceled 56926 non-null int64 15 avg_price_per_room 56926 non-null float64 16 no_of_special_requests 56926 non-null int64 17 booking_status 56926 non-null object dtypes: float64(1), int64(13), object(4) memory usage: 7.8+ MB
no_of_adults 0 no_of_children 0 no_of_weekend_nights 0 no_of_week_nights 0 type_of_meal_plan 0 required_car_parking_space 0 room_type_reserved 0 lead_time 0 arrival_year 0 arrival_month 0 arrival_date 0 market_segment_type 0 repeated_guest 0 no_of_previous_cancellations 0 no_of_previous_bookings_not_canceled 0 avg_price_per_room 0 no_of_special_requests 0 booking_status 0 dtype: int64
data.describe()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | required_car_parking_space | lead_time | arrival_year | arrival_month | arrival_date | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 56926.000000 | 56926.000000 | 56926.00000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 |
| mean | 1.875856 | 0.110723 | 0.83584 | 2.261901 | 0.026332 | 93.713909 | 2018.248340 | 6.490215 | 15.635913 | 0.024664 | 0.020939 | 0.167902 | 109.610570 | 0.666040 |
| std | 0.518667 | 0.408885 | 0.87590 | 1.432371 | 0.160123 | 92.408296 | 0.644619 | 3.027185 | 8.718717 | 0.155099 | 0.326142 | 1.943647 | 38.256075 | 0.814257 |
| min | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 | 0.000000 | 2017.000000 | 1.000000 | 1.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
| 25% | 2.000000 | 0.000000 | 0.00000 | 1.000000 | 0.000000 | 21.000000 | 2018.000000 | 4.000000 | 8.000000 | 0.000000 | 0.000000 | 0.000000 | 85.000000 | 0.000000 |
| 50% | 2.000000 | 0.000000 | 1.00000 | 2.000000 | 0.000000 | 65.000000 | 2018.000000 | 6.000000 | 16.000000 | 0.000000 | 0.000000 | 0.000000 | 105.000000 | 0.000000 |
| 75% | 2.000000 | 0.000000 | 2.00000 | 3.000000 | 0.000000 | 142.000000 | 2019.000000 | 9.000000 | 23.000000 | 0.000000 | 0.000000 | 0.000000 | 129.700000 | 1.000000 |
| max | 4.000000 | 10.000000 | 8.00000 | 17.000000 | 1.000000 | 521.000000 | 2019.000000 | 12.000000 | 31.000000 | 1.000000 | 13.000000 | 72.000000 | 540.000000 | 5.000000 |
# summary statistics for all columns, including categorical ones
data.describe(include="all")
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 56926.000000 | 56926.000000 | 56926.00000 | 56926.000000 | 56926 | 56926.000000 | 56926 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926.000000 | 56926 |
| unique | NaN | NaN | NaN | NaN | 4 | NaN | 7 | NaN | NaN | NaN | NaN | 5 | NaN | NaN | NaN | NaN | NaN | 2 |
| top | NaN | NaN | NaN | NaN | Meal Plan 1 | NaN | Room_Type 1 | NaN | NaN | NaN | NaN | Online | NaN | NaN | NaN | NaN | NaN | Not_Canceled |
| freq | NaN | NaN | NaN | NaN | 42330 | NaN | 42807 | NaN | NaN | NaN | NaN | 39490 | NaN | NaN | NaN | NaN | NaN | 35378 |
| mean | 1.875856 | 0.110723 | 0.83584 | 2.261901 | NaN | 0.026332 | NaN | 93.713909 | 2018.248340 | 6.490215 | 15.635913 | NaN | 0.024664 | 0.020939 | 0.167902 | 109.610570 | 0.666040 | NaN |
| std | 0.518667 | 0.408885 | 0.87590 | 1.432371 | NaN | 0.160123 | NaN | 92.408296 | 0.644619 | 3.027185 | 8.718717 | NaN | 0.155099 | 0.326142 | 1.943647 | 38.256075 | 0.814257 | NaN |
| min | 0.000000 | 0.000000 | 0.00000 | 0.000000 | NaN | 0.000000 | NaN | 0.000000 | 2017.000000 | 1.000000 | 1.000000 | NaN | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | NaN |
| 25% | 2.000000 | 0.000000 | 0.00000 | 1.000000 | NaN | 0.000000 | NaN | 21.000000 | 2018.000000 | 4.000000 | 8.000000 | NaN | 0.000000 | 0.000000 | 0.000000 | 85.000000 | 0.000000 | NaN |
| 50% | 2.000000 | 0.000000 | 1.00000 | 2.000000 | NaN | 0.000000 | NaN | 65.000000 | 2018.000000 | 6.000000 | 16.000000 | NaN | 0.000000 | 0.000000 | 0.000000 | 105.000000 | 0.000000 | NaN |
| 75% | 2.000000 | 0.000000 | 2.00000 | 3.000000 | NaN | 0.000000 | NaN | 142.000000 | 2019.000000 | 9.000000 | 23.000000 | NaN | 0.000000 | 0.000000 | 0.000000 | 129.700000 | 1.000000 | NaN |
| max | 4.000000 | 10.000000 | 8.00000 | 17.000000 | NaN | 1.000000 | NaN | 521.000000 | 2019.000000 | 12.000000 | 31.000000 | NaN | 1.000000 | 13.000000 | 72.000000 | 540.000000 | 5.000000 | NaN |
data[data.duplicated()].count()
no_of_adults 14350 no_of_children 14350 no_of_weekend_nights 14350 no_of_week_nights 14350 type_of_meal_plan 14350 required_car_parking_space 14350 room_type_reserved 14350 lead_time 14350 arrival_year 14350 arrival_month 14350 arrival_date 14350 market_segment_type 14350 repeated_guest 14350 no_of_previous_cancellations 14350 no_of_previous_bookings_not_canceled 14350 avg_price_per_room 14350 no_of_special_requests 14350 booking_status 14350 dtype: int64
# We will drop the duplicated values
data.drop_duplicates(inplace=True)
data.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 42576 entries, 0 to 56924 Data columns (total 18 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 no_of_adults 42576 non-null int64 1 no_of_children 42576 non-null int64 2 no_of_weekend_nights 42576 non-null int64 3 no_of_week_nights 42576 non-null int64 4 type_of_meal_plan 42576 non-null object 5 required_car_parking_space 42576 non-null int64 6 room_type_reserved 42576 non-null object 7 lead_time 42576 non-null int64 8 arrival_year 42576 non-null int64 9 arrival_month 42576 non-null int64 10 arrival_date 42576 non-null int64 11 market_segment_type 42576 non-null object 12 repeated_guest 42576 non-null int64 13 no_of_previous_cancellations 42576 non-null int64 14 no_of_previous_bookings_not_canceled 42576 non-null int64 15 avg_price_per_room 42576 non-null float64 16 no_of_special_requests 42576 non-null int64 17 booking_status 42576 non-null object dtypes: float64(1), int64(13), object(4) memory usage: 6.2+ MB
# categorical columns
cat_columns = data.select_dtypes(include=["object"]).columns.tolist()
cat_columns
for i in cat_columns:
print(data[i].value_counts())
print("*" * 50)
Meal Plan 1 31863 Not Selected 8716 Meal Plan 2 1989 Meal Plan 3 8 Name: type_of_meal_plan, dtype: int64 ************************************************** Room_Type 1 29730 Room_Type 4 9369 Room_Type 6 1540 Room_Type 5 906 Room_Type 2 718 Room_Type 7 307 Room_Type 3 6 Name: room_type_reserved, dtype: int64 ************************************************** Online 34169 Offline 5777 Corporate 1939 Complementary 496 Aviation 195 Name: market_segment_type, dtype: int64 ************************************************** Not_Canceled 28089 Canceled 14487 Name: booking_status, dtype: int64 **************************************************
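The counts above show some very sparse levels (Meal Plan 3 has only 8 rows, Room_Type 3 only 6). A common precaution before encoding is to fold such levels into an "Other" bucket. This is only an illustrative sketch, not a step taken in this notebook; `collapse_rare_levels` and the `threshold=50` cutoff are assumptions.

```python
import pandas as pd

def collapse_rare_levels(s, threshold=50):
    """Replace categories with fewer than `threshold` rows by 'Other'."""
    counts = s.value_counts()
    rare = counts[counts < threshold].index
    return s.where(~s.isin(rare), "Other")

# tiny demo series mimicking a dominant level plus a rare one
demo = pd.Series(["Meal Plan 1"] * 100 + ["Meal Plan 3"] * 3)
print(collapse_rare_levels(demo).value_counts())
```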
Questions:
def histogram_boxplot(data, feature, figsize=(12, 7), kde=False, bins=None):
"""
Boxplot and histogram combined
data: dataframe
feature: dataframe column
figsize: size of figure (default (12,7))
kde: whether to show the density curve (default False)
bins: number of bins for histogram (default None)
"""
f2, (ax_box2, ax_hist2) = plt.subplots(
nrows=2, # Number of rows of the subplot grid= 2
sharex=True, # x-axis will be shared among all subplots
gridspec_kw={"height_ratios": (0.25, 0.75)},
figsize=figsize,
) # creating the 2 subplots
    sns.boxplot(
        data=data, x=feature, ax=ax_box2, showmeans=True, color="violet"
    )  # boxplot; the mean of the column is shown as a marker
    if bins:
        sns.histplot(data=data, x=feature, kde=kde, ax=ax_hist2, bins=bins)
    else:
        sns.histplot(data=data, x=feature, kde=kde, ax=ax_hist2)  # histogram
ax_hist2.axvline(
data[feature].mean(), color="green", linestyle="--"
) # Add mean to the histogram
ax_hist2.axvline(
data[feature].median(), color="black", linestyle="-"
) # Add median to the histogram
histogram_boxplot(data, "no_of_adults")
histogram_boxplot(data, "no_of_children")
histogram_boxplot(data, "lead_time")
histogram_boxplot(data, "avg_price_per_room")
# function to create labeled barplots
def labeled_barplot(data, feature, perc=False, n=None):
"""
Barplot with percentage at the top
data: dataframe
feature: dataframe column
perc: whether to display percentages instead of count (default is False)
n: displays the top n category levels (default is None, i.e., display all levels)
"""
total = len(data[feature]) # length of the column
count = data[feature].nunique()
if n is None:
plt.figure(figsize=(count + 2, 6))
else:
plt.figure(figsize=(n + 2, 6))
plt.xticks(rotation=90, fontsize=15)
ax = sns.countplot(
data=data,
x=feature,
palette="Paired",
order=data[feature].value_counts().index[:n].sort_values(),
)
for p in ax.patches:
        if perc:
label = "{:.1f}%".format(
100 * p.get_height() / total
) # percentage of each class of the category
else:
label = p.get_height() # count of each level of the category
        x = p.get_x() + p.get_width() / 2  # x-coordinate of the bar's center
        y = p.get_height()  # height of the bar
ax.annotate(
label,
(x, y),
ha="center",
va="center",
size=12,
xytext=(0, 5),
textcoords="offset points",
) # annotate the percentage
plt.show() # show the plot
labeled_barplot(data, "no_of_children", perc=True)
labeled_barplot(data, "no_of_adults", perc=True)
# change "arrival_month" from numeric codes to month names
month_names = {
    1: "January", 2: "February", 3: "March", 4: "April",
    5: "May", 6: "June", 7: "July", 8: "August",
    9: "September", 10: "October", 11: "November", 12: "December",
}
data["arrival_month"] = data["arrival_month"].replace(month_names)
data["arrival_month"].value_counts()
August 5312 July 4725 May 4348 April 4227 June 4073 March 4044 October 3209 September 3057 February 2889 December 2385 November 2192 January 2115 Name: arrival_month, dtype: int64
labeled_barplot(data, "arrival_month", perc=True)
labeled_barplot(data, "arrival_year", perc=True)
labeled_barplot(data, "market_segment_type", perc=True)
labeled_barplot(data, "booking_status", perc=True)
# Change repeated_guest to Yes/No for clarity
data["repeated_guest"] = data["repeated_guest"].replace({0: "No", 1: "Yes"})
labeled_barplot(data, "repeated_guest", perc=True)
- 96.9% of guests are new guests. Why is there such a low share of returning customers? The customer experience may need improvement.
labeled_barplot(data, "type_of_meal_plan", perc=True)
labeled_barplot(data, "no_of_weekend_nights", perc=True)
labeled_barplot(data, "no_of_week_nights", perc=True)
# change 0/1 to No/Yes
data["required_car_parking_space"] = data["required_car_parking_space"].replace(
    {0: "No", 1: "Yes"}
)
labeled_barplot(data, "required_car_parking_space", perc=True)
labeled_barplot(data, "room_type_reserved", perc=True)
labeled_barplot(data, "no_of_special_requests", perc=True)
# Encode booking status numerically (Canceled = 0, Not_Canceled = 1) to see correlations
data2 = data.copy()
data2["booking_status"] = data2["booking_status"].replace(
    {"Canceled": 0, "Not_Canceled": 1}
)
plt.figure(figsize=(20, 7))
sns.heatmap(data2.corr(), annot=True, vmin=-1, vmax=1, fmt=".2f", cmap="Spectral")
plt.show()
data2.corr().T
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | lead_time | arrival_year | arrival_date | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| no_of_adults | 1.000000 | -0.046768 | 0.088448 | 0.114718 | 0.157586 | 0.089816 | 0.007152 | -0.082402 | -0.151376 | 0.352854 | 0.113269 | -0.122187 |
| no_of_children | -0.046768 | 1.000000 | 0.015463 | 0.022396 | 0.036515 | 0.012982 | 0.016474 | -0.021786 | -0.029038 | 0.344863 | 0.063826 | -0.082644 |
| no_of_weekend_nights | 0.088448 | 0.015463 | 1.000000 | 0.234575 | 0.116011 | 0.025955 | 0.000177 | -0.036461 | -0.048818 | 0.002365 | 0.006193 | -0.066511 |
| no_of_week_nights | 0.114718 | 0.022396 | 0.234575 | 1.000000 | 0.209997 | 0.049051 | -0.014510 | -0.039081 | -0.058228 | 0.024760 | 0.026863 | -0.134013 |
| lead_time | 0.157586 | 0.036515 | 0.116011 | 0.209997 | 1.000000 | 0.210627 | 0.036721 | -0.060561 | -0.088774 | 0.007367 | 0.024544 | -0.420148 |
| arrival_year | 0.089816 | 0.012982 | 0.025955 | 0.049051 | 0.210627 | 1.000000 | -0.003047 | -0.005479 | 0.012817 | 0.239247 | 0.034592 | -0.178416 |
| arrival_date | 0.007152 | 0.016474 | 0.000177 | -0.014510 | 0.036721 | -0.003047 | 1.000000 | -0.008540 | -0.000034 | 0.016588 | -0.001544 | -0.009968 |
| no_of_previous_cancellations | -0.082402 | -0.021786 | -0.036461 | -0.039081 | -0.060561 | -0.005479 | -0.008540 | 1.000000 | 0.582212 | -0.084619 | 0.010017 | 0.047631 |
| no_of_previous_bookings_not_canceled | -0.151376 | -0.029038 | -0.048818 | -0.058228 | -0.088774 | 0.012817 | -0.000034 | 0.582212 | 1.000000 | -0.124801 | 0.034580 | 0.070607 |
| avg_price_per_room | 0.352854 | 0.344863 | 0.002365 | 0.024760 | 0.007367 | 0.239247 | 0.016588 | -0.084619 | -0.124801 | 1.000000 | 0.128621 | -0.200509 |
| no_of_special_requests | 0.113269 | 0.063826 | 0.006193 | 0.026863 | 0.024544 | 0.034592 | -0.001544 | 0.010017 | 0.034580 | 0.128621 | 1.000000 | 0.237047 |
| booking_status | -0.122187 | -0.082644 | -0.066511 | -0.134013 | -0.420148 | -0.178416 | -0.009968 | 0.047631 | 0.070607 | -0.200509 | 0.237047 | 1.000000 |
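To read the matrix above more quickly, the correlations with the target can be ranked directly. The snippet below is a sketch on a toy frame (the notebook's `data2` is built above); the column names mirror the real data and `booking_status` uses the same encoding, 1 = Not_Canceled.

```python
import pandas as pd

# toy frame: long lead times co-occur with cancellations (status 0),
# special requests with completed stays (status 1)
toy = pd.DataFrame({
    "lead_time": [5, 200, 10, 300],
    "no_of_special_requests": [2, 0, 1, 0],
    "booking_status": [1, 0, 1, 0],
})
# correlations with the target, most cancellation-associated (negative) first
print(toy.corr()["booking_status"].drop("booking_status").sort_values())
```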
# a function for an alternative graphical view of the correlation matrix
def plot_corr(df, size=15):
corr = df.corr()
fig, ax = plt.subplots(figsize=(size, size))
ax.matshow(corr)
plt.xticks(range(len(corr.columns)), corr.columns)
plt.yticks(range(len(corr.columns)), corr.columns)
for (i, j), z in np.ndenumerate(corr):
ax.text(j, i, "{:0.1f}".format(z), ha="center", va="center")
plot_corr(data2)
sns.pairplot(data=data2, hue="booking_status")
plt.show()
sns.catplot(x="market_segment_type", y="lead_time", hue="booking_status", data=data)
plt.figure(figsize=(10, 5))
sns.barplot(x="market_segment_type", y="lead_time", hue="booking_status", data=data)
<AxesSubplot:xlabel='market_segment_type', ylabel='lead_time'>
# comparing booking status and lead time vs repeated guest
sns.barplot(x="booking_status", y="lead_time", hue="repeated_guest", data=data)
<AxesSubplot:xlabel='booking_status', ylabel='lead_time'>
# Booking status and lead time vs arrival year
sns.barplot(x="booking_status", y="lead_time", hue="arrival_year", data=data)
<AxesSubplot:xlabel='booking_status', ylabel='lead_time'>
sns.barplot(x="no_of_adults", y="avg_price_per_room", hue="booking_status", data=data)
<AxesSubplot:xlabel='no_of_adults', ylabel='avg_price_per_room'>
# room price vs lead time by cancelled
sns.scatterplot(x="avg_price_per_room", y="lead_time", data=data, hue="booking_status")
<AxesSubplot:xlabel='avg_price_per_room', ylabel='lead_time'>
plt.figure(figsize=(15, 5))
sns.barplot(
x="room_type_reserved", y="avg_price_per_room", hue="booking_status", data=data
)
<AxesSubplot:xlabel='room_type_reserved', ylabel='avg_price_per_room'>
data.groupby("market_segment_type")["avg_price_per_room"].mean()
market_segment_type Aviation 103.234256 Complementary 2.773044 Corporate 82.486086 Offline 87.675326 Online 119.891277 Name: avg_price_per_room, dtype: float64
data.groupby("market_segment_type")["avg_price_per_room"].describe()
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| market_segment_type | ||||||||
| Aviation | 195.0 | 103.234256 | 15.255131 | 79.0 | 95.00 | 95.00 | 110.0 | 193.5 |
| Complementary | 496.0 | 2.773044 | 14.523454 | 0.0 | 0.00 | 0.00 | 0.0 | 170.0 |
| Corporate | 1939.0 | 82.486086 | 24.669243 | 31.0 | 65.00 | 75.00 | 95.0 | 315.0 |
| Offline | 5777.0 | 87.675326 | 26.564144 | 12.0 | 72.25 | 83.75 | 98.0 | 540.0 |
| Online | 34169.0 | 119.891277 | 39.211545 | 0.0 | 91.63 | 115.00 | 140.4 | 510.0 |
plt.figure(figsize=(15, 5))
plt.subplot(1, 2, 1)
sns.barplot(data=data, y="avg_price_per_room", x="market_segment_type")
plt.xticks(rotation=45)
plt.subplot(1, 2, 2)
sns.boxplot(data=data, y="avg_price_per_room", x="market_segment_type")
plt.xticks(rotation=45)
plt.show()
data.groupby("booking_status")["repeated_guest"].value_counts()
booking_status repeated_guest
Canceled No 14477
Yes 10
Not_Canceled No 26784
Yes 1305
Name: repeated_guest, dtype: int64
guests = 10 / (10 + 1305) * 100
print("The percentage of repeat guests that cancel is", round(guests, 2), "percent")
The percentage of repeat guests that cancel is 0.76 percent
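The same percentage falls out of a row-normalized crosstab, which also gives the rate for new guests in one step. The frame below is a tiny illustrative stand-in, not the real data (where the repeat-guest cancellation rate computed above is 0.76%).

```python
import pandas as pd

toy = pd.DataFrame({
    "repeated_guest": ["Yes", "Yes", "No", "No"],
    "booking_status": ["Canceled", "Not_Canceled", "Canceled", "Not_Canceled"],
})
# each row sums to 1: the cells are cancellation rates per guest type
print(pd.crosstab(toy["repeated_guest"], toy["booking_status"], normalize="index"))
```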
data.groupby("booking_status")["no_of_special_requests"].value_counts()
booking_status no_of_special_requests
Canceled 0 8752
1 4346
2 1389
Not_Canceled 1 11225
0 10476
2 4992
3 1230
4 150
5 16
Name: no_of_special_requests, dtype: int64
data.booking_status.value_counts()
Not_Canceled 28089 Canceled 14487 Name: booking_status, dtype: int64
# Booking Status vs No of special requests
sns.barplot(
x="booking_status", y="no_of_special_requests", hue="repeated_guest", data=data
)
<AxesSubplot:xlabel='booking_status', ylabel='no_of_special_requests'>
# Cancellations vs Arrival Months
plt.figure(figsize=(15, 5))
sns.countplot(x="arrival_month", data=data, hue="booking_status")
<AxesSubplot:xlabel='arrival_month', ylabel='count'>
data.groupby("booking_status")["arrival_month"].value_counts()
booking_status arrival_month
Canceled August 2475
July 2240
May 1674
April 1627
June 1584
March 1195
October 918
September 888
February 796
November 496
December 340
January 254
Not_Canceled March 2849
August 2837
May 2674
April 2600
June 2489
July 2485
October 2291
September 2169
February 2093
December 2045
January 1861
November 1696
Name: arrival_month, dtype: int64
# Do dates affect cancellations?
data.groupby("booking_status")["arrival_date"].value_counts()
booking_status arrival_date
Canceled 17 544
26 530
15 510
27 507
16 505
29 501
8 496
3 494
7 491
12 489
25 486
28 483
2 480
6 479
11 474
21 472
20 470
4 467
10 463
22 463
24 463
13 462
9 457
1 456
18 456
5 438
23 433
30 431
19 426
14 414
31 247
Not_Canceled 2 1033
19 1028
11 996
5 995
20 995
13 961
27 959
17 947
9 942
12 940
29 940
26 939
4 928
21 927
16 922
7 921
3 918
6 918
18 918
28 918
10 909
15 909
8 908
1 861
23 849
14 839
25 818
30 810
22 799
24 789
31 553
Name: arrival_date, dtype: int64
plt.figure(figsize=(15, 5))
sns.countplot(x="arrival_date", data=data, hue="booking_status")
<AxesSubplot:xlabel='arrival_date', ylabel='count'>
### Function to plot stacked bar charts for categorical columns
def stacked_barplot(data, predictor, target):
"""
Print the category counts and plot a stacked bar chart
data: dataframe
predictor: independent variable
target: target variable
"""
count = data[predictor].nunique()
sorter = data[target].value_counts().index[-1]
tab1 = pd.crosstab(data[predictor], data[target], margins=True).sort_values(
by=sorter, ascending=False
)
print(tab1)
print("-" * 120)
tab = pd.crosstab(data[predictor], data[target], normalize="index").sort_values(
by=sorter, ascending=False
)
tab.plot(kind="bar", stacked=True, figsize=(count + 5, 5))
plt.legend(
loc="lower left",
frameon=False,
)
plt.legend(loc="upper left", bbox_to_anchor=(1, 1))
plt.show()
stacked_barplot(data, "room_type_reserved", "booking_status")
booking_status Canceled Not_Canceled All room_type_reserved All 14487 28089 42576 Room_Type 1 9225 20505 29730 Room_Type 4 3683 5686 9369 Room_Type 6 826 714 1540 Room_Type 5 367 539 906 Room_Type 2 274 444 718 Room_Type 7 110 197 307 Room_Type 3 2 4 6 ------------------------------------------------------------------------------------------------------------------------
data.groupby("room_type_reserved")["avg_price_per_room"].describe()
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| room_type_reserved | ||||||||
| Room_Type 1 | 29730.0 | 100.092176 | 30.690012 | 0.0 | 80.0000 | 96.300 | 119.00 | 540.00 |
| Room_Type 2 | 718.0 | 90.586657 | 35.885009 | 0.0 | 77.2500 | 86.630 | 103.05 | 284.10 |
| Room_Type 3 | 6.0 | 85.958333 | 49.623688 | 0.0 | 68.9375 | 95.375 | 125.00 | 130.00 |
| Room_Type 4 | 9369.0 | 133.247350 | 35.743346 | 0.0 | 110.0000 | 133.100 | 155.00 | 375.50 |
| Room_Type 5 | 906.0 | 158.718366 | 50.939994 | 0.0 | 125.0000 | 162.000 | 198.00 | 269.00 |
| Room_Type 6 | 1540.0 | 190.853740 | 45.094137 | 0.0 | 167.4500 | 190.000 | 220.00 | 349.63 |
| Room_Type 7 | 307.0 | 186.015212 | 94.121170 | 0.0 | 170.9450 | 211.410 | 243.90 | 352.50 |
data.isnull().sum()
no_of_adults 0 no_of_children 0 no_of_weekend_nights 0 no_of_week_nights 0 type_of_meal_plan 0 required_car_parking_space 0 room_type_reserved 0 lead_time 0 arrival_year 0 arrival_month 0 arrival_date 0 market_segment_type 0 repeated_guest 0 no_of_previous_cancellations 0 no_of_previous_bookings_not_canceled 0 avg_price_per_room 0 no_of_special_requests 0 booking_status 0 dtype: int64
numeric_columns2 = data2.select_dtypes(include=np.number).columns.tolist()
# let's plot the boxplots of all columns to check for outliers
plt.figure(figsize=(20, 30))
for i, variable in enumerate(numeric_columns2):
plt.subplot(5, 4, i + 1)
plt.boxplot(data2[variable], whis=1.5)
plt.tight_layout()
plt.title(variable)
plt.show()
# treat outliers so results can be compared with and without them
def treat_outliers(df, col):
    """
    Treat outliers in a numerical variable by capping at the whiskers
    df: dataframe
    col: name of the numerical column
    """
    Q1 = df[col].quantile(0.25)  # 25th percentile
    Q3 = df[col].quantile(0.75)  # 75th percentile
    IQR = Q3 - Q1
    Lower_Whisker = Q1 - 1.5 * IQR
    Upper_Whisker = Q3 + 1.5 * IQR
    # values below Lower_Whisker are raised to Lower_Whisker,
    # values above Upper_Whisker are lowered to Upper_Whisker
    df[col] = np.clip(df[col], Lower_Whisker, Upper_Whisker)
    return df
def treat_outliers_all(df, col_list):
"""
    Treat outliers in all the given numerical variables
    df: dataframe
    col_list: list of numerical column names
"""
for c in col_list:
df = treat_outliers(df, c)
return df
numerical_col2 = data2.select_dtypes(include=np.number).columns.tolist()
df = treat_outliers_all(data2, numerical_col2)
# let's look at the boxplots to see if the outliers have been treated or not
plt.figure(figsize=(20, 30))
for i, variable in enumerate(numeric_columns2):
plt.subplot(5, 4, i + 1)
plt.boxplot(data2[variable], whis=1.5)
plt.tight_layout()
plt.title(variable)
plt.show()
data2.describe()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | lead_time | arrival_year | arrival_date | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 42576.0 | 42576.0 | 42576.000000 | 42576.000000 | 42576.000000 | 42576.000000 | 42576.000000 | 42576.0 | 42576.0 | 42576.000000 | 42576.000000 | 42576.000000 |
| mean | 2.0 | 0.0 | 0.894354 | 2.286053 | 75.929843 | 2018.297891 | 15.682873 | 0.0 | 0.0 | 111.910108 | 0.747440 | 0.659738 |
| std | 0.0 | 0.0 | 0.882914 | 1.373238 | 72.739869 | 0.626126 | 8.813991 | 0.0 | 0.0 | 38.364236 | 0.781979 | 0.473803 |
| min | 2.0 | 0.0 | 0.000000 | 0.000000 | 0.000000 | 2017.000000 | 1.000000 | 0.0 | 0.0 | 11.250000 | 0.000000 | 0.000000 |
| 25% | 2.0 | 0.0 | 0.000000 | 1.000000 | 16.000000 | 2018.000000 | 8.000000 | 0.0 | 0.0 | 85.500000 | 0.000000 | 0.000000 |
| 50% | 2.0 | 0.0 | 1.000000 | 2.000000 | 53.000000 | 2018.000000 | 16.000000 | 0.0 | 0.0 | 107.000000 | 1.000000 | 1.000000 |
| 75% | 2.0 | 0.0 | 2.000000 | 3.000000 | 118.000000 | 2019.000000 | 23.000000 | 0.0 | 0.0 | 135.000000 | 1.000000 | 1.000000 |
| max | 2.0 | 0.0 | 5.000000 | 6.000000 | 271.000000 | 2019.000000 | 31.000000 | 0.0 | 0.0 | 209.250000 | 2.500000 | 1.000000 |
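The describe() output above shows a side effect of IQR clipping: count-style columns with little spread (no_of_adults, no_of_children, no_of_previous_cancellations, no_of_previous_bookings_not_canceled) collapsed to a single value. A quick zero-variance check catches such columns before modeling; this is a sketch on a toy frame mirroring that situation, not a step from the notebook.

```python
import pandas as pd

# toy frame: one column flattened by clipping, one still informative
toy = pd.DataFrame({"no_of_adults": [2, 2, 2, 2], "lead_time": [16, 53, 118, 271]})
# columns left with a single unique value carry no signal for a classifier
constant_cols = [c for c in toy.columns if toy[c].nunique() == 1]
print(constant_cols)
```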
x = data2.drop(["booking_status"], axis=1)
y = data2["booking_status"]
print(x.head())
print(y.head())
no_of_adults no_of_children no_of_weekend_nights no_of_week_nights \ 0 2 0 1 2 1 2 0 2 3 2 2 0 2 1 3 2 0 0 2 4 2 0 0 3 type_of_meal_plan required_car_parking_space room_type_reserved lead_time \ 0 Meal Plan 1 No Room_Type 1 224 1 Not Selected No Room_Type 1 5 2 Meal Plan 1 No Room_Type 1 1 3 Meal Plan 1 No Room_Type 1 211 4 Not Selected No Room_Type 1 271 arrival_year arrival_month arrival_date market_segment_type \ 0 2017 October 2 Offline 1 2018 November 6 Online 2 2018 February 28 Online 3 2018 May 20 Online 4 2019 July 13 Online repeated_guest no_of_previous_cancellations \ 0 No 0 1 No 0 2 No 0 3 No 0 4 No 0 no_of_previous_bookings_not_canceled avg_price_per_room \ 0 0 65.00 1 0 106.68 2 0 60.00 3 0 100.00 4 0 89.10 no_of_special_requests 0 0.0 1 1.0 2 0.0 3 0.0 4 2.0 0 1 1 1 2 0 3 0 4 0 Name: booking_status, dtype: int64
# Creating dummy variables for the remaining non-numeric columns
data2.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 42576 entries, 0 to 56924
Data columns (total 18 columns):
 #   Column                                Non-Null Count  Dtype
---  ------                                --------------  -----
 0   no_of_adults                          42576 non-null  int64
 1   no_of_children                        42576 non-null  int64
 2   no_of_weekend_nights                  42576 non-null  int64
 3   no_of_week_nights                     42576 non-null  int64
 4   type_of_meal_plan                     42576 non-null  object
 5   required_car_parking_space            42576 non-null  object
 6   room_type_reserved                    42576 non-null  object
 7   lead_time                             42576 non-null  int64
 8   arrival_year                          42576 non-null  int64
 9   arrival_month                         42576 non-null  object
 10  arrival_date                          42576 non-null  int64
 11  market_segment_type                   42576 non-null  object
 12  repeated_guest                        42576 non-null  object
 13  no_of_previous_cancellations          42576 non-null  int64
 14  no_of_previous_bookings_not_canceled  42576 non-null  int64
 15  avg_price_per_room                    42576 non-null  float64
 16  no_of_special_requests                42576 non-null  float64
 17  booking_status                        42576 non-null  int64
dtypes: float64(2), int64(10), object(6)
memory usage: 7.4+ MB
# creating dummy variables (one-hot encoding, dropping the first level of each categorical)
dummy_data = pd.get_dummies(
data2,
columns=[
"type_of_meal_plan",
"required_car_parking_space",
"room_type_reserved",
"arrival_month",
"market_segment_type",
"repeated_guest",
"booking_status",
],
drop_first=True,
)
dummy_data.head()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | lead_time | arrival_year | arrival_date | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | type_of_meal_plan_Meal Plan 2 | type_of_meal_plan_Meal Plan 3 | type_of_meal_plan_Not Selected | required_car_parking_space_Yes | room_type_reserved_Room_Type 2 | room_type_reserved_Room_Type 3 | room_type_reserved_Room_Type 4 | room_type_reserved_Room_Type 5 | room_type_reserved_Room_Type 6 | room_type_reserved_Room_Type 7 | arrival_month_August | arrival_month_December | arrival_month_February | arrival_month_January | arrival_month_July | arrival_month_June | arrival_month_March | arrival_month_May | arrival_month_November | arrival_month_October | arrival_month_September | market_segment_type_Complementary | market_segment_type_Corporate | market_segment_type_Offline | market_segment_type_Online | repeated_guest_Yes | booking_status_1 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | 224 | 2017 | 2 | 0 | 0 | 65.00 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| 1 | 2 | 0 | 2 | 3 | 5 | 2018 | 6 | 0 | 0 | 106.68 | 1.0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
| 2 | 2 | 0 | 2 | 1 | 1 | 2018 | 28 | 0 | 0 | 60.00 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| 3 | 2 | 0 | 0 | 2 | 211 | 2018 | 20 | 0 | 0 | 100.00 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| 4 | 2 | 0 | 0 | 3 | 271 | 2019 | 13 | 0 | 0 | 89.10 | 2.0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
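As a minimal sketch of what `drop_first=True` does (the toy column and values below are hypothetical, not from the dataset): a categorical with k levels becomes k-1 indicator columns, with the alphabetically first level serving as the all-zeros baseline.

```python
import pandas as pd

# Toy column (hypothetical values, not from the dataset).
toy = pd.DataFrame({"meal_plan": ["Plan 1", "Plan 2", "Not Selected", "Plan 1"]})

# drop_first=True drops the alphabetically first level ("Not Selected"),
# which is then represented implicitly by all-zero indicators.
encoded = pd.get_dummies(toy, columns=["meal_plan"], drop_first=True)
print(encoded.columns.tolist())  # ['meal_plan_Plan 1', 'meal_plan_Plan 2']
```

Dropping one level per categorical avoids the exact collinearity (the "dummy variable trap") that would otherwise inflate the VIFs computed later.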
X = dummy_data.drop("booking_status_1", axis=1) # Features
y = dummy_data["booking_status_1"]
# Splitting the data into training and test sets in a 70:30 ratio:
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
print(X_train.shape, X_test.shape)
(29803, 37) (12773, 37)
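A side note: `train_test_split` is called above without `stratify`. A sketch on synthetic data (names and numbers hypothetical) of how `stratify=y` keeps the class ratio nearly identical across the two splits, which can matter with an imbalanced target like cancellations:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic features and an imbalanced target (~30% positives); all hypothetical.
rng = np.random.default_rng(1)
X_demo = rng.normal(size=(1000, 3))
y_demo = (rng.random(1000) < 0.3).astype(int)

# stratify preserves the positive rate in both partitions (up to rounding).
Xtr, Xte, ytr, yte = train_test_split(
    X_demo, y_demo, test_size=0.3, random_state=1, stratify=y_demo
)
print(round(ytr.mean(), 2), round(yte.mean(), 2))
```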
print("Number of rows in train data =", X_train.shape[0])
print("Number of rows in test data =", X_test.shape[0])
Number of rows in train data = 29803 Number of rows in test data = 12773
Variance Inflation Factor: Variance inflation factors measure the inflation in the variances of the regression coefficient estimates due to collinearity among the predictors. The VIF for a predictor measures how much the variance of the estimated coefficient β̂k is "inflated" by the correlation between that predictor and the remaining predictors in the model.
General rule of thumb: a VIF of 1 means the kth predictor is uncorrelated with the remaining predictors, so the variance of β̂k is not inflated at all. A VIF above 5 indicates moderate multicollinearity, and a VIF of 10 or more signals high multicollinearity. The purpose of the analysis should dictate which threshold to use.
from statsmodels.stats.outliers_influence import variance_inflation_factor
vif_series = pd.Series(
[variance_inflation_factor(X_train.values, i) for i in range(X_train.shape[1])],
index=X_train.columns,
dtype=float,
)
print("Series before feature selection: \n\n{}\n".format(vif_series))
Series before feature selection: 

no_of_adults                            2.000965e+07
no_of_children                                   NaN
no_of_weekend_nights                    1.061545e+00
no_of_week_nights                       1.130761e+00
lead_time                               1.486586e+00
arrival_year                            1.935457e+00
arrival_date                            1.008332e+00
no_of_previous_cancellations                     NaN
no_of_previous_bookings_not_canceled             NaN
avg_price_per_room                      3.089645e+00
no_of_special_requests                  1.110538e+00
type_of_meal_plan_Meal Plan 2           1.108447e+00
type_of_meal_plan_Meal Plan 3           1.026861e+00
type_of_meal_plan_Not Selected          1.345420e+00
required_car_parking_space_Yes          1.041176e+00
room_type_reserved_Room_Type 2          1.035800e+00
room_type_reserved_Room_Type 3          1.001856e+00
room_type_reserved_Room_Type 4          1.373261e+00
room_type_reserved_Room_Type 5          1.120986e+00
room_type_reserved_Room_Type 6          1.410749e+00
room_type_reserved_Room_Type 7          1.108310e+00
arrival_month_August                    2.099557e+00
arrival_month_December                  1.650217e+00
arrival_month_February                  1.704090e+00
arrival_month_January                   1.562677e+00
arrival_month_July                      1.954896e+00
arrival_month_June                      1.816429e+00
arrival_month_March                     1.876716e+00
arrival_month_May                       1.879578e+00
arrival_month_November                  1.576781e+00
arrival_month_October                   1.860984e+00
arrival_month_September                 1.855614e+00
market_segment_type_Complementary       3.874405e+00
market_segment_type_Corporate           1.067202e+01
market_segment_type_Offline             2.696437e+01
market_segment_type_Online              3.612024e+01
repeated_guest_Yes                      1.586646e+00
dtype: float64
# Dropping the columns whose VIF could not be computed (NaN): these are constant (zero-variance) in the filtered data
x_train2 = X_train.drop(
[
"no_of_children",
"no_of_previous_cancellations",
"no_of_previous_bookings_not_canceled",
],
axis=1,
)
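The NaN VIFs above typically indicate zero-variance (constant) columns: a constant has no variance to explain, so its VIF is undefined. A small sketch (hypothetical values) of how to detect such columns before computing VIFs:

```python
import pandas as pd

# Hypothetical mini-frame with one column left constant by earlier filtering.
df_demo = pd.DataFrame({
    "no_of_children": [0, 0, 0, 0],  # constant: zero variance
    "lead_time": [224, 5, 1, 211],
})

# A column with a single unique value carries no information for the model.
constant_cols = [c for c in df_demo.columns if df_demo[c].nunique() == 1]
print(constant_cols)  # ['no_of_children']
```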
Predicting that a customer will contribute to the revenue when, in reality, the customer would not have contributed: a loss of resources.
Predicting that a customer will not contribute to the revenue when, in reality, the customer would have contributed: a loss of opportunity.
Recall should therefore be maximized: the greater the recall, the better the chances of minimizing false negatives.
# defining a function to compute different metrics to check performance of a classification model built using statsmodels
def model_performance_classification_statsmodels(
model, predictors, target, threshold=0.5
):
"""
Function to compute different metrics to check classification model performance
model: classifier
predictors: independent variables
target: dependent variable
threshold: threshold for classifying the observation as class 1
"""
# classifying observations as class 1 when the predicted probability exceeds the threshold
pred = model.predict(predictors) > threshold
acc = accuracy_score(target, pred) # to compute Accuracy
recall = recall_score(target, pred) # to compute Recall
precision = precision_score(target, pred) # to compute Precision
f1 = f1_score(target, pred) # to compute F1-score
# creating a dataframe of metrics
df_perf = pd.DataFrame(
{
"Accuracy": acc,
"Recall": recall,
"Precision": precision,
"F1": f1,
},
index=[0],
)
return df_perf
# sklearn's LogisticRegression supports several solvers (lbfgs, liblinear, newton-cg, sag, saga)
# newton-cg is used here
from sklearn.linear_model import LogisticRegression
lg = LogisticRegression(solver="newton-cg", random_state=1)
model = lg.fit(x_train2, y_train)
print("Training performance:")
model_performance_classification_statsmodels(lg, x_train2, y_train)
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.792336 | 0.881433 | 0.818417 | 0.848757 |
# defining a function to plot the confusion_matrix of a classification model
def confusion_matrix_statsmodels(model, predictors, target, threshold=0.5):
"""
To plot the confusion_matrix with percentages
model: classifier
predictors: independent variables
target: dependent variable
threshold: threshold for classifying the observation as class 1
"""
y_pred = model.predict(predictors) > threshold
cm = confusion_matrix(target, y_pred)
labels = np.asarray(
[
["{0:0.0f}".format(item) + "\n{0:.2%}".format(item / cm.flatten().sum())]
for item in cm.flatten()
]
).reshape(2, 2)
plt.figure(figsize=(6, 4))
sns.heatmap(cm, annot=labels, fmt="")
plt.ylabel("True label")
plt.xlabel("Predicted label")
log_reg_model_train_perf = model_performance_classification_statsmodels(
lg, x_train2, y_train
)
print("Training performance:")
log_reg_model_train_perf
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.792336 | 0.881433 | 0.818417 | 0.848757 |
# creating confusion matrix
confusion_matrix_statsmodels(lg, x_train2, y_train)
False Negatives (FN): for 7.84% of bookings (2,336) the model incorrectly predicted that the customer would not cancel.
Overall, the model correctly classified 79.23% of the observations.
# fitting the logistic regression model with statsmodels (no intercept term is added; note that the summary below reports converged: False)
logit = sm.Logit(y_train, x_train2.astype(float))
lg2 = logit.fit(disp=False)
print(lg2.summary())
Logit Regression Results
==============================================================================
Dep. Variable: booking_status_1 No. Observations: 29803
Model: Logit Df Residuals: 29769
Method: MLE Df Model: 33
Date: Fri, 17 Sep 2021 Pseudo R-squ.: inf
Time: 12:05:47 Log-Likelihood: -inf
converged: False LL-Null: 0.0000
Covariance Type: nonrobust LLR p-value: 1.000
=====================================================================================================
coef std err z P>|z| [0.025 0.975]
-----------------------------------------------------------------------------------------------------
no_of_adults -26.9990 35.673 -0.757 0.449 -96.916 42.918
no_of_weekend_nights -0.0395 0.018 -2.200 0.028 -0.075 -0.004
no_of_week_nights -0.0498 0.012 -4.175 0.000 -0.073 -0.026
lead_time -0.0180 0.000 -60.657 0.000 -0.019 -0.017
arrival_year 0.0283 0.035 0.802 0.423 -0.041 0.098
arrival_date 0.0020 0.002 1.143 0.253 -0.001 0.006
avg_price_per_room -0.0185 0.001 -24.424 0.000 -0.020 -0.017
no_of_special_requests 1.3366 0.024 55.307 0.000 1.289 1.384
type_of_meal_plan_Meal Plan 2 0.1413 0.080 1.775 0.076 -0.015 0.297
type_of_meal_plan_Meal Plan 3 -1.1240 176.584 -0.006 0.995 -347.223 344.975
type_of_meal_plan_Not Selected -0.3833 0.044 -8.802 0.000 -0.469 -0.298
required_car_parking_space_Yes 1.5685 0.118 13.322 0.000 1.338 1.799
room_type_reserved_Room_Type 2 0.0697 0.124 0.560 0.575 -0.174 0.313
room_type_reserved_Room_Type 3 -0.2787 1.316 -0.212 0.832 -2.858 2.300
room_type_reserved_Room_Type 4 0.1699 0.044 3.870 0.000 0.084 0.256
room_type_reserved_Room_Type 5 0.3389 0.113 2.995 0.003 0.117 0.561
room_type_reserved_Room_Type 6 0.2506 0.097 2.576 0.010 0.060 0.441
room_type_reserved_Room_Type 7 0.3178 0.190 1.673 0.094 -0.055 0.690
arrival_month_August 0.0567 0.065 0.870 0.385 -0.071 0.184
arrival_month_December 1.0485 0.102 10.315 0.000 0.849 1.248
arrival_month_February -0.6425 0.078 -8.210 0.000 -0.796 -0.489
arrival_month_January 0.6497 0.107 6.094 0.000 0.441 0.859
arrival_month_July 0.1415 0.065 2.182 0.029 0.014 0.269
arrival_month_June 0.2380 0.067 3.532 0.000 0.106 0.370
arrival_month_March -0.2794 0.070 -4.016 0.000 -0.416 -0.143
arrival_month_May 0.2898 0.066 4.389 0.000 0.160 0.419
arrival_month_November -0.3129 0.090 -3.494 0.000 -0.488 -0.137
arrival_month_October 0.1468 0.081 1.817 0.069 -0.012 0.305
arrival_month_September 0.1211 0.082 1.480 0.139 -0.039 0.281
market_segment_type_Complementary 8.9760 20.832 0.431 0.667 -31.854 49.806
market_segment_type_Corporate 0.5012 0.273 1.838 0.066 -0.033 1.036
market_segment_type_Offline 2.0757 0.257 8.076 0.000 1.572 2.579
market_segment_type_Online -0.0771 0.252 -0.306 0.759 -0.570 0.416
repeated_guest_Yes 2.5784 0.441 5.843 0.000 1.713 3.443
=====================================================================================================
logit_roc_auc_train = roc_auc_score(y_train, lg2.predict(x_train2))
fpr, tpr, thresholds = roc_curve(y_train, lg2.predict(x_train2))
plt.figure(figsize=(7, 5))
plt.plot(fpr, tpr, label="Logistic Regression (area = %0.2f)" % logit_roc_auc_train)
plt.plot([0, 1], [0, 1], "r--")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("Receiver operating characteristic")
plt.legend(loc="lower right")
plt.show()
# Optimal threshold as per AUC-ROC curve
# The optimal cut off would be where tpr is high and fpr is low
fpr, tpr, thresholds = roc_curve(y_train, lg2.predict(x_train2))
optimal_idx = np.argmax(tpr - fpr)
optimal_threshold_auc_roc = thresholds[optimal_idx]
print(optimal_threshold_auc_roc)
0.6705327224785566
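The cell above applies Youden's J statistic, argmax(tpr - fpr). A tiny self-contained illustration of the same rule on made-up labels and scores:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical labels and predicted probabilities.
y_true = np.array([0, 0, 0, 1, 1, 1])
y_prob = np.array([0.10, 0.40, 0.35, 0.80, 0.65, 0.90])
fpr_d, tpr_d, thr_d = roc_curve(y_true, y_prob)

# The optimal cut-off sits at the ROC corner closest to the top-left.
best = thr_d[np.argmax(tpr_d - fpr_d)]
print(best)  # 0.65: every positive scores above it, every negative below
```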
# creating confusion matrix
confusion_matrix_statsmodels(
lg2, x_train2, y_train, threshold=optimal_threshold_auc_roc
)
# checking model performance for this model
log_reg_model_train_perf_threshold_auc_roc = (
model_performance_classification_statsmodels(
lg2, x_train2, y_train, threshold=optimal_threshold_auc_roc
)
)
print("Training performance:")
log_reg_model_train_perf_threshold_auc_roc
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.774956 | 0.767435 | 0.876776 | 0.81847 |
y_scores = lg2.predict(x_train2)
prec, rec, tre = precision_recall_curve(
y_train,
y_scores,
)
def plot_prec_recall_vs_tresh(precisions, recalls, thresholds):
plt.plot(thresholds, precisions[:-1], "b--", label="precision")
plt.plot(thresholds, recalls[:-1], "g--", label="recall")
plt.xlabel("Threshold")
plt.legend(loc="upper left")
plt.ylim([0, 1])
plt.figure(figsize=(10, 7))
plot_prec_recall_vs_tresh(prec, rec, tre)
plt.show()
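The 0.58 threshold set below is read off the plot by eye. As an alternative sketch (not the approach used in this notebook), the crossover of the two curves can be located programmatically on hypothetical scores:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

# Hypothetical labels and scores: pick the threshold where precision and recall are closest.
y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1])
y_prob = np.array([0.20, 0.30, 0.45, 0.50, 0.60, 0.70, 0.75, 0.90])
prec_d, rec_d, thr_d = precision_recall_curve(y_true, y_prob)

# thresholds has one fewer entry than precision/recall (the final point
# (precision=1, recall=0) has no threshold), hence the [:-1] slices.
crossover = thr_d[np.argmin(np.abs(prec_d[:-1] - rec_d[:-1]))]
print(crossover)  # 0.6: precision and recall are both 0.75 there
```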
# setting the threshold
optimal_threshold_curve = 0.58
# creating confusion matrix
confusion_matrix_statsmodels(lg2, x_train2, y_train, threshold=optimal_threshold_curve)
log_reg_model_train_perf_threshold_curve = model_performance_classification_statsmodels(
lg2, x_train2, y_train, threshold=optimal_threshold_curve
)
print("Training performance:")
log_reg_model_train_perf_threshold_curve
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.789786 | 0.833875 | 0.84594 | 0.839864 |
# training performance comparison
models_train_comp_df = pd.concat(
[
log_reg_model_train_perf.T,
log_reg_model_train_perf_threshold_auc_roc.T,
log_reg_model_train_perf_threshold_curve.T,
],
axis=1,
)
models_train_comp_df.columns = [
    "Logistic Regression sklearn",
    "Logistic Regression-0.67 Threshold",
    "Logistic Regression-0.58 Threshold",
]
print("Training performance comparison:")
models_train_comp_df
Training performance comparison:
| Logistic Regression sklearn | Logistic Regression-0.67 Threshold | Logistic Regression-0.58 Threshold | |
|---|---|---|---|
| Accuracy | 0.792336 | 0.774956 | 0.789786 |
| Recall | 0.881433 | 0.767435 | 0.833875 |
| Precision | 0.818417 | 0.876776 | 0.845940 |
| F1 | 0.848757 | 0.818470 | 0.839864 |
# Adjusting test set with same columns as training set
X_test2 = X_test[list(x_train2.columns)]
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test2, y_test)
log_reg_model_test_perf = model_performance_classification_statsmodels(
lg2, X_test2, y_test
)
print("Test performance:")
log_reg_model_test_perf
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.793471 | 0.881602 | 0.818011 | 0.848617 |
logit_roc_auc_train = roc_auc_score(y_test, lg2.predict(X_test2))
fpr, tpr, thresholds = roc_curve(y_test, lg2.predict(X_test2))
plt.figure(figsize=(7, 5))
plt.plot(fpr, tpr, label="Logistic Regression (area = %0.2f)" % logit_roc_auc_train)
plt.plot([0, 1], [0, 1], "r--")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("Receiver operating characteristic")
plt.legend(loc="lower right")
plt.show()
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test2, y_test, threshold=optimal_threshold_auc_roc)
# checking model performance for this model
log_reg_model_test_perf_threshold_auc_roc = (
model_performance_classification_statsmodels(
lg2, X_test2, y_test, threshold=optimal_threshold_auc_roc
)
)
print("Test performance:")
log_reg_model_test_perf_threshold_auc_roc
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.772959 | 0.762609 | 0.875565 | 0.815192 |
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test2, y_test, threshold=optimal_threshold_curve)
log_reg_model_test_perf_threshold_curve = model_performance_classification_statsmodels(
lg2, X_test2, y_test, threshold=optimal_threshold_curve
)
print("Test performance:")
log_reg_model_test_perf_threshold_curve
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.789948 | 0.83546 | 0.843201 | 0.839312 |
# testing performance comparison
models_test_comp_df = pd.concat(
[
log_reg_model_test_perf.T,
log_reg_model_test_perf_threshold_auc_roc.T,
log_reg_model_test_perf_threshold_curve.T,
],
axis=1,
)
models_test_comp_df.columns = [
    "Logistic Regression sklearn",
    "Logistic Regression-0.67 Threshold",
    "Logistic Regression-0.58 Threshold",
]
print("Test set performance comparison:")
models_test_comp_df
Test set performance comparison:
| Logistic Regression sklearn | Logistic Regression-0.67 Threshold | Logistic Regression-0.58 Threshold | |
|---|---|---|---|
| Accuracy | 0.793471 | 0.772959 | 0.789948 |
| Recall | 0.881602 | 0.762609 | 0.835460 |
| Precision | 0.818011 | 0.875565 | 0.843201 |
| F1 | 0.848617 | 0.815192 | 0.839312 |
model = DecisionTreeClassifier(criterion="gini", random_state=1)
model.fit(x_train2, y_train)
DecisionTreeClassifier(random_state=1)
## Function to calculate recall score
def get_recall_score(model, predictors, target):
"""
model: classifier
predictors: independent variables
target: dependent variable
"""
prediction = model.predict(predictors)
return recall_score(target, prediction)
# Model Performance Check
def confusion_matrix_sklearn(model, predictors, target):
"""
To plot the confusion_matrix with percentages
model: classifier
predictors: independent variables
target: dependent variable
"""
y_pred = model.predict(predictors)
cm = confusion_matrix(target, y_pred)
labels = np.asarray(
[
["{0:0.0f}".format(item) + "\n{0:.2%}".format(item / cm.flatten().sum())]
for item in cm.flatten()
]
).reshape(2, 2)
plt.figure(figsize=(6, 4))
sns.heatmap(cm, annot=labels, fmt="")
plt.ylabel("True label")
plt.xlabel("Predicted label")
confusion_matrix_sklearn(model, x_train2, y_train)
decision_tree_perf_train = get_recall_score(model, x_train2, y_train)
print("Recall Score:", decision_tree_perf_train)
Recall Score: 0.9947213480864887
confusion_matrix_sklearn(model, X_test2, y_test)
decision_tree_perf_test = get_recall_score(model, X_test2, y_test)
print("Recall Score:", decision_tree_perf_test)
Recall Score: 0.8281864790747585
## creating a list of column names
feature_names = x_train2.columns.to_list()
# importance of features in the tree building ( The importance of a feature is computed as the
# (normalized) total reduction of the criterion brought by that feature. It is also known as the Gini importance )
print(
pd.DataFrame(
model.feature_importances_, columns=["Imp"], index=x_train2.columns
).sort_values(by="Imp", ascending=False)
)
                                           Imp
lead_time                             0.353446
avg_price_per_room                    0.162761
no_of_special_requests                0.097043
arrival_date                          0.086514
market_segment_type_Online            0.078117
no_of_week_nights                     0.049120
no_of_weekend_nights                  0.033008
arrival_year                          0.020178
arrival_month_December                0.014162
room_type_reserved_Room_Type 4        0.012134
type_of_meal_plan_Not Selected        0.009805
arrival_month_July                    0.009085
arrival_month_May                     0.007969
arrival_month_January                 0.007887
required_car_parking_space_Yes        0.007672
arrival_month_August                  0.007117
arrival_month_November                0.006446
arrival_month_June                    0.006435
arrival_month_February                0.006113
arrival_month_March                   0.005620
arrival_month_October                 0.003959
arrival_month_September               0.003302
type_of_meal_plan_Meal Plan 2         0.003087
room_type_reserved_Room_Type 5        0.002982
room_type_reserved_Room_Type 2        0.002142
room_type_reserved_Room_Type 6        0.001186
market_segment_type_Offline           0.000765
room_type_reserved_Room_Type 7        0.000664
repeated_guest_Yes                    0.000644
market_segment_type_Corporate         0.000636
market_segment_type_Complementary     0.000000
room_type_reserved_Room_Type 3        0.000000
type_of_meal_plan_Meal Plan 3         0.000000
no_of_adults                          0.000000
importances = model.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
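Impurity-based importances like the ones above can be biased toward features with many split points. A hedged sketch of permutation importance, a model-agnostic alternative, on synthetic data (all values hypothetical):

```python
import numpy as np
from sklearn.inspection import permutation_importance
from sklearn.tree import DecisionTreeClassifier

# Synthetic data where only the first feature carries signal (hypothetical setup).
rng = np.random.default_rng(1)
X_demo = rng.normal(size=(500, 3))
y_demo = (X_demo[:, 0] > 0).astype(int)

tree_demo = DecisionTreeClassifier(random_state=1).fit(X_demo, y_demo)

# Permutation importance: shuffle one column at a time and measure the score drop.
result = permutation_importance(tree_demo, X_demo, y_demo, n_repeats=5, random_state=1)
print(result.importances_mean.argmax())  # feature 0 dominates
```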
# Choose the type of classifier.
estimator = DecisionTreeClassifier(random_state=1, class_weight={0: 0.15, 1: 0.85})
# Grid of parameters to choose from
parameters = {
    # note: the grid needs a flat list of candidate values;
    # list(...) + [None] tries each depth plus an unbounded tree
    "max_depth": list(np.arange(2, 50, 5)) + [None],
    "criterion": ["entropy", "gini"],
    "splitter": ["best", "random"],
    "min_impurity_decrease": [0.000001, 0.00001, 0.0001],
}
# Type of scoring used to compare parameter combinations
scorer = make_scorer(recall_score)
# Run the grid search
grid_obj = GridSearchCV(estimator, parameters, scoring=scorer, cv=5)
grid_obj = grid_obj.fit(x_train2, y_train)
# Set the clf to the best combination of parameters
estimator = grid_obj.best_estimator_
# Fit the best algorithm to the data.
estimator.fit(x_train2, y_train)
DecisionTreeClassifier(class_weight={0: 0.15, 1: 0.85},
min_impurity_decrease=0.0001, random_state=1)
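The `class_weight={0: 0.15, 1: 0.85}` setting above penalizes misclassified positives more heavily, which pushes the tree toward higher recall. A sketch on synthetic imbalanced data (setup and numbers hypothetical) showing the effect:

```python
import numpy as np
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced problem (hypothetical setup).
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(2000, 4))
y_demo = ((X_demo[:, 0] + rng.normal(scale=2.0, size=2000)) > 1.5).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X_demo, y_demo, random_state=0, stratify=y_demo)

recalls = []
for w in [None, {0: 0.15, 1: 0.85}]:
    clf_demo = DecisionTreeClassifier(max_depth=3, class_weight=w, random_state=0)
    clf_demo.fit(Xtr, ytr)
    recalls.append(recall_score(yte, clf_demo.predict(Xte)))

# Up-weighting class 1 shifts leaf predictions toward the positive class,
# trading precision for recall.
print([round(r, 2) for r in recalls])
```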
# Performance Check on Training Data
confusion_matrix_sklearn(estimator, x_train2, y_train)
# defining a function to compute different metrics to check performance of a classification model built using sklearn
def model_performance_classification_sklearn(model, predictors, target):
"""
Function to compute different metrics to check classification model performance
model: classifier
predictors: independent variables
target: dependent variable
"""
# predicting using the independent variables
pred = model.predict(predictors)
acc = accuracy_score(target, pred) # to compute Accuracy
recall = recall_score(target, pred) # to compute Recall
precision = precision_score(target, pred) # to compute Precision
f1 = f1_score(target, pred) # to compute F1-score
# creating a dataframe of metrics
df_perf = pd.DataFrame(
{
"Accuracy": acc,
"Recall": recall,
"Precision": precision,
"F1": f1,
},
index=[0],
)
return df_perf
decision_tree_perf_train = model_performance_classification_sklearn(
estimator, x_train2, y_train
)
decision_tree_perf_train
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.79388 | 0.998782 | 0.762802 | 0.864986 |
plt.figure(figsize=(20, 20))
out = tree.plot_tree(
estimator,
feature_names=feature_names,
filled=True,
fontsize=9,
node_ids=False,
class_names=None,
)
for o in out:
arrow = o.arrow_patch
if arrow is not None:
arrow.set_edgecolor("black")
arrow.set_linewidth(1)
plt.show()
confusion_matrix_sklearn(estimator, X_test2, y_test)
decision_tree_perf_test = model_performance_classification_sklearn(
estimator, X_test2, y_test
)
decision_tree_perf_test
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.787207 | 0.997138 | 0.756353 | 0.860214 |
# importance of features in the tree building ( The importance of a feature is computed as the
# (normalized) total reduction of the 'criterion' brought by that feature. It is also known as the Gini importance )
print(
pd.DataFrame(
estimator.feature_importances_, columns=["Imp"], index=x_train2.columns
).sort_values(by="Imp", ascending=False)
)
# After tuning, the importance is concentrated in fewer features: lead_time, avg_price_per_room and no_of_special_requests account for most of it
                                           Imp
lead_time                             0.290504
avg_price_per_room                    0.277812
no_of_special_requests                0.190463
market_segment_type_Online            0.130476
arrival_month_December                0.073796
arrival_month_January                 0.010467
no_of_weekend_nights                  0.007009
required_car_parking_space_Yes        0.005203
arrival_year                          0.004150
no_of_week_nights                     0.004037
arrival_date                          0.002297
type_of_meal_plan_Not Selected        0.002074
room_type_reserved_Room_Type 6        0.001712
arrival_month_May                     0.000000
arrival_month_November                0.000000
no_of_adults                          0.000000
arrival_month_October                 0.000000
arrival_month_September               0.000000
market_segment_type_Complementary     0.000000
market_segment_type_Corporate         0.000000
arrival_month_June                    0.000000
market_segment_type_Offline           0.000000
arrival_month_March                   0.000000
room_type_reserved_Room_Type 7        0.000000
arrival_month_July                    0.000000
arrival_month_February                0.000000
arrival_month_August                  0.000000
room_type_reserved_Room_Type 5        0.000000
room_type_reserved_Room_Type 4        0.000000
room_type_reserved_Room_Type 3        0.000000
room_type_reserved_Room_Type 2        0.000000
type_of_meal_plan_Meal Plan 3         0.000000
type_of_meal_plan_Meal Plan 2         0.000000
repeated_guest_Yes                    0.000000
importances = estimator.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
clf = DecisionTreeClassifier(random_state=1, class_weight={0: 0.15, 1: 0.85})
path = clf.cost_complexity_pruning_path(x_train2, y_train)
ccp_alphas, impurities = path.ccp_alphas, path.impurities
pd.DataFrame(path)
| ccp_alphas | impurities | |
|---|---|---|
| 0 | 0.000000e+00 | 0.001493 |
| 1 | -2.371692e-20 | 0.001493 |
| 2 | 0.000000e+00 | 0.001493 |
| 3 | 0.000000e+00 | 0.001493 |
| 4 | 0.000000e+00 | 0.001493 |
| ... | ... | ... |

*(Rows 5-213 omitted: every remaining ccp_alpha shown is numerically zero, on the order of 1e-20, i.e. floating-point noise, with the impurity constant at 0.001493.)*
| 214 | 3.100528e-20 | 0.001493 |
| 215 | 3.100528e-20 | 0.001493 |
| 216 | 3.100528e-20 | 0.001493 |
| 217 | 3.100528e-20 | 0.001493 |
| 218 | 3.100528e-20 | 0.001493 |
| 219 | 3.100528e-20 | 0.001493 |
| 220 | 3.100528e-20 | 0.001493 |
| 221 | 3.100528e-20 | 0.001493 |
| 222 | 3.100528e-20 | 0.001493 |
| 223 | 3.100528e-20 | 0.001493 |
| 224 | 3.100528e-20 | 0.001493 |
| 225 | 3.100528e-20 | 0.001493 |
| 226 | 3.100528e-20 | 0.001493 |
| 227 | 3.100528e-20 | 0.001493 |
| 228 | 3.100528e-20 | 0.001493 |
| 229 | 3.100528e-20 | 0.001493 |
| 230 | 3.100528e-20 | 0.001493 |
| 231 | 3.100528e-20 | 0.001493 |
| 232 | 3.100528e-20 | 0.001493 |
| 233 | 3.100528e-20 | 0.001493 |
| 234 | 3.100528e-20 | 0.001493 |
| 235 | 3.100528e-20 | 0.001493 |
| 236 | 3.100528e-20 | 0.001493 |
| 237 | 3.100528e-20 | 0.001493 |
| 238 | 3.100528e-20 | 0.001493 |
| 239 | 3.100528e-20 | 0.001493 |
| 240 | 3.100528e-20 | 0.001493 |
| 241 | 3.100528e-20 | 0.001493 |
| 242 | 3.100528e-20 | 0.001493 |
| 243 | 3.100528e-20 | 0.001493 |
| 244 | 3.100528e-20 | 0.001493 |
| 245 | 3.100528e-20 | 0.001493 |
| 246 | 3.100528e-20 | 0.001493 |
| 247 | 3.100528e-20 | 0.001493 |
| 248 | 3.100528e-20 | 0.001493 |
| 249 | 3.100528e-20 | 0.001493 |
| 250 | 3.100528e-20 | 0.001493 |
| 251 | 3.100528e-20 | 0.001493 |
| 252 | 3.100528e-20 | 0.001493 |
| 253 | 3.100528e-20 | 0.001493 |
| 254 | 4.103640e-20 | 0.001493 |
| 255 | 4.134037e-20 | 0.001493 |
| 256 | 4.134037e-20 | 0.001493 |
| 257 | 4.134037e-20 | 0.001493 |
| 258 | 4.134037e-20 | 0.001493 |
| 259 | 4.134037e-20 | 0.001493 |
| 260 | 4.134037e-20 | 0.001493 |
| 261 | 4.377216e-20 | 0.001493 |
| 262 | 4.468408e-20 | 0.001493 |
| 263 | 4.468408e-20 | 0.001493 |
| 264 | 5.106752e-20 | 0.001493 |
| 265 | 5.106752e-20 | 0.001493 |
| 266 | 5.106752e-20 | 0.001493 |
| 267 | 6.018672e-20 | 0.001493 |
| 268 | 6.201056e-20 | 0.001493 |
| 269 | 6.201056e-20 | 0.001493 |
| 270 | 6.201056e-20 | 0.001493 |
| 271 | 6.201056e-20 | 0.001493 |
| 272 | 6.201056e-20 | 0.001493 |
| 273 | 6.201056e-20 | 0.001493 |
| 274 | 6.201056e-20 | 0.001493 |
| 275 | 6.201056e-20 | 0.001493 |
| 276 | 6.201056e-20 | 0.001493 |
| 277 | 6.201056e-20 | 0.001493 |
| 278 | 6.201056e-20 | 0.001493 |
| 279 | 6.201056e-20 | 0.001493 |
| 280 | 6.201056e-20 | 0.001493 |
| 281 | 6.201056e-20 | 0.001493 |
| 282 | 6.201056e-20 | 0.001493 |
| 283 | 6.201056e-20 | 0.001493 |
| 284 | 6.201056e-20 | 0.001493 |
| 285 | 6.201056e-20 | 0.001493 |
| 286 | 6.201056e-20 | 0.001493 |
| 287 | 6.201056e-20 | 0.001493 |
| 288 | 6.201056e-20 | 0.001493 |
| 289 | 6.201056e-20 | 0.001493 |
| 290 | 6.201056e-20 | 0.001493 |
| 291 | 6.201056e-20 | 0.001493 |
| 292 | 6.201056e-20 | 0.001493 |
| 293 | 6.201056e-20 | 0.001493 |
| 294 | 6.201056e-20 | 0.001493 |
| 295 | 6.201056e-20 | 0.001493 |
| 296 | 6.201056e-20 | 0.001493 |
| 297 | 6.201056e-20 | 0.001493 |
| 298 | 6.201056e-20 | 0.001493 |
| 299 | 6.201056e-20 | 0.001493 |
| 300 | 6.201056e-20 | 0.001493 |
| 301 | 6.201056e-20 | 0.001493 |
| 302 | 6.201056e-20 | 0.001493 |
| 303 | 6.201056e-20 | 0.001493 |
| 304 | 6.201056e-20 | 0.001493 |
| 305 | 6.201056e-20 | 0.001493 |
| 306 | 6.201056e-20 | 0.001493 |
| 307 | 6.201056e-20 | 0.001493 |
| 308 | 6.201056e-20 | 0.001493 |
| 309 | 6.201056e-20 | 0.001493 |
| 310 | 6.201056e-20 | 0.001493 |
| 311 | 6.201056e-20 | 0.001493 |
| 312 | 6.201056e-20 | 0.001493 |
| 313 | 6.201056e-20 | 0.001493 |
| 314 | 6.201056e-20 | 0.001493 |
| 315 | 6.201056e-20 | 0.001493 |
| 316 | 6.201056e-20 | 0.001493 |
| 317 | 6.201056e-20 | 0.001493 |
| 318 | 7.234565e-20 | 0.001493 |
| 319 | 7.234565e-20 | 0.001493 |
| 320 | 7.234565e-20 | 0.001493 |
| 321 | 7.234565e-20 | 0.001493 |
| 322 | 7.234565e-20 | 0.001493 |
| 323 | 7.234565e-20 | 0.001493 |
| 324 | 7.234565e-20 | 0.001493 |
| 325 | 7.234565e-20 | 0.001493 |
| 326 | 7.295360e-20 | 0.001493 |
| 327 | 7.295360e-20 | 0.001493 |
| 328 | 7.295360e-20 | 0.001493 |
| 329 | 7.295360e-20 | 0.001493 |
| 330 | 7.295360e-20 | 0.001493 |
| 331 | 7.295360e-20 | 0.001493 |
| 332 | 7.295360e-20 | 0.001493 |
| 333 | 7.295360e-20 | 0.001493 |
| 334 | 7.295360e-20 | 0.001493 |
| 335 | 7.295360e-20 | 0.001493 |
| 336 | 7.295360e-20 | 0.001493 |
| 337 | 7.295360e-20 | 0.001493 |
| 338 | 7.295360e-20 | 0.001493 |
| 339 | 7.295360e-20 | 0.001493 |
| 340 | 7.295360e-20 | 0.001493 |
| 341 | 7.295360e-20 | 0.001493 |
| 342 | 7.295360e-20 | 0.001493 |
| 343 | 7.295360e-20 | 0.001493 |
| 344 | 7.295360e-20 | 0.001493 |
| 345 | 7.295360e-20 | 0.001493 |
| 346 | 7.295360e-20 | 0.001493 |
| 347 | 7.295360e-20 | 0.001493 |
| 348 | 7.295360e-20 | 0.001493 |
| 349 | 7.295360e-20 | 0.001493 |
| 350 | 7.295360e-20 | 0.001493 |
| 351 | 7.295360e-20 | 0.001493 |
| 352 | 7.295360e-20 | 0.001493 |
| 353 | 7.295360e-20 | 0.001493 |
| 354 | 7.295360e-20 | 0.001493 |
| 355 | 7.295360e-20 | 0.001493 |
| 356 | 7.295360e-20 | 0.001493 |
| 357 | 7.295360e-20 | 0.001493 |
| 358 | 7.295360e-20 | 0.001493 |
| 359 | 7.295360e-20 | 0.001493 |
| 360 | 7.295360e-20 | 0.001493 |
| 361 | 7.295360e-20 | 0.001493 |
| 362 | 7.295360e-20 | 0.001493 |
| 363 | 7.295360e-20 | 0.001493 |
| 364 | 7.295360e-20 | 0.001493 |
| 365 | 7.295360e-20 | 0.001493 |
| 366 | 7.295360e-20 | 0.001493 |
| 367 | 7.295360e-20 | 0.001493 |
| 368 | 7.295360e-20 | 0.001493 |
| 369 | 7.295360e-20 | 0.001493 |
| 370 | 7.295360e-20 | 0.001493 |
| 371 | 7.295360e-20 | 0.001493 |
| 372 | 7.295360e-20 | 0.001493 |
| 373 | 7.295360e-20 | 0.001493 |
| 374 | 7.295360e-20 | 0.001493 |
| 375 | 7.295360e-20 | 0.001493 |
| 376 | 7.295360e-20 | 0.001493 |
| 377 | 7.295360e-20 | 0.001493 |
| 378 | 7.295360e-20 | 0.001493 |
| 379 | 7.295360e-20 | 0.001493 |
| 380 | 7.295360e-20 | 0.001493 |
| 381 | 7.295360e-20 | 0.001493 |
| 382 | 7.295360e-20 | 0.001493 |
| 383 | 7.295360e-20 | 0.001493 |
| 384 | 7.295360e-20 | 0.001493 |
| 385 | 7.295360e-20 | 0.001493 |
| 386 | 7.295360e-20 | 0.001493 |
| 387 | 7.842512e-20 | 0.001493 |
| 388 | 7.842512e-20 | 0.001493 |
| 389 | 7.842512e-20 | 0.001493 |
| 390 | 7.842512e-20 | 0.001493 |
| 391 | 7.842512e-20 | 0.001493 |
| 392 | 7.842512e-20 | 0.001493 |
| 393 | 8.207280e-20 | 0.001493 |
| 394 | 8.207280e-20 | 0.001493 |
| 395 | 8.572048e-20 | 0.001493 |
| 396 | 8.572048e-20 | 0.001493 |
| 397 | 8.572048e-20 | 0.001493 |
| 398 | 8.572048e-20 | 0.001493 |
| 399 | 8.572048e-20 | 0.001493 |
| 400 | 8.572048e-20 | 0.001493 |
| 401 | 8.572048e-20 | 0.001493 |
| 402 | 9.301584e-20 | 0.001493 |
| 403 | 9.301584e-20 | 0.001493 |
| 404 | 9.301584e-20 | 0.001493 |
| 405 | 9.301584e-20 | 0.001493 |
| 406 | 9.575159e-20 | 0.001493 |
| 407 | 9.575159e-20 | 0.001493 |
| 408 | 9.575159e-20 | 0.001493 |
| 409 | 9.666351e-20 | 0.001493 |
| 410 | 9.666351e-20 | 0.001493 |
| 411 | 9.666351e-20 | 0.001493 |
| 412 | 9.666351e-20 | 0.001493 |
| 413 | 9.666351e-20 | 0.001493 |
| 414 | 9.666351e-20 | 0.001493 |
| 415 | 9.666351e-20 | 0.001493 |
| 416 | 9.666351e-20 | 0.001493 |
| 417 | 9.666351e-20 | 0.001493 |
| 418 | 9.666351e-20 | 0.001493 |
| 419 | 9.666351e-20 | 0.001493 |
| 420 | 9.666351e-20 | 0.001493 |
| 421 | 9.666351e-20 | 0.001493 |
| 422 | 9.666351e-20 | 0.001493 |
| 423 | 9.666351e-20 | 0.001493 |
| 424 | 9.666351e-20 | 0.001493 |
| 425 | 9.666351e-20 | 0.001493 |
| 426 | 9.666351e-20 | 0.001493 |
| 427 | 9.666351e-20 | 0.001493 |
| 428 | 9.666351e-20 | 0.001493 |
| 429 | 9.666351e-20 | 0.001493 |
| 430 | 9.666351e-20 | 0.001493 |
| 431 | 9.666351e-20 | 0.001493 |
| 432 | 9.666351e-20 | 0.001493 |
| 433 | 9.666351e-20 | 0.001493 |
| 434 | 1.003112e-19 | 0.001493 |
| 435 | 1.021350e-19 | 0.001493 |
| 436 | 1.021350e-19 | 0.001493 |
| 437 | 1.033509e-19 | 0.001493 |
| 438 | 1.033509e-19 | 0.001493 |
| 439 | 1.033509e-19 | 0.001493 |
| 440 | 1.033509e-19 | 0.001493 |
| 441 | 1.057827e-19 | 0.001493 |
| 442 | 1.057827e-19 | 0.001493 |
| 443 | 1.094304e-19 | 0.001493 |
| 444 | 1.094304e-19 | 0.001493 |
| 445 | 1.094304e-19 | 0.001493 |
| 446 | 1.094304e-19 | 0.001493 |
| 447 | 1.094304e-19 | 0.001493 |
| 448 | 1.094304e-19 | 0.001493 |
| 449 | 1.094304e-19 | 0.001493 |
| 450 | 1.094304e-19 | 0.001493 |
| 451 | 1.094304e-19 | 0.001493 |
| 452 | 1.094304e-19 | 0.001493 |
| 453 | 1.094304e-19 | 0.001493 |
| 454 | 1.094304e-19 | 0.001493 |
| 455 | 1.094304e-19 | 0.001493 |
| 456 | 1.094304e-19 | 0.001493 |
| 457 | 1.094304e-19 | 0.001493 |
| 458 | 1.094304e-19 | 0.001493 |
| 459 | 1.094304e-19 | 0.001493 |
| 460 | 1.094304e-19 | 0.001493 |
| 461 | 1.094304e-19 | 0.001493 |
| 462 | 1.094304e-19 | 0.001493 |
| 463 | 1.094304e-19 | 0.001493 |
| 464 | 1.149019e-19 | 0.001493 |
| 465 | 1.149019e-19 | 0.001493 |
| 466 | 1.149019e-19 | 0.001493 |
| 467 | 1.203734e-19 | 0.001493 |
| 468 | 1.203734e-19 | 0.001493 |
| 469 | 1.203734e-19 | 0.001493 |
| 470 | 1.203734e-19 | 0.001493 |
| 471 | 1.203734e-19 | 0.001493 |
| 472 | 1.240211e-19 | 0.001493 |
| 473 | 1.240211e-19 | 0.001493 |
| 474 | 1.240211e-19 | 0.001493 |
| 475 | 1.240211e-19 | 0.001493 |
| 476 | 1.240211e-19 | 0.001493 |
| 477 | 1.240211e-19 | 0.001493 |
| 478 | 1.240211e-19 | 0.001493 |
| 479 | 1.240211e-19 | 0.001493 |
| 480 | 1.304046e-19 | 0.001493 |
| 481 | 1.340522e-19 | 0.001493 |
| 482 | 1.340522e-19 | 0.001493 |
| 483 | 1.340522e-19 | 0.001493 |
| 484 | 1.340522e-19 | 0.001493 |
| 485 | 1.340522e-19 | 0.001493 |
| 486 | 1.343562e-19 | 0.001493 |
| 487 | 1.446913e-19 | 0.001493 |
| 488 | 1.449953e-19 | 0.001493 |
| 489 | 1.449953e-19 | 0.001493 |
| 490 | 1.449953e-19 | 0.001493 |
| 491 | 1.449953e-19 | 0.001493 |
| 492 | 1.449953e-19 | 0.001493 |
| 493 | 1.449953e-19 | 0.001493 |
| 494 | 1.449953e-19 | 0.001493 |
| 495 | 1.449953e-19 | 0.001493 |
| 496 | 1.449953e-19 | 0.001493 |
| 497 | 1.449953e-19 | 0.001493 |
| 498 | 1.449953e-19 | 0.001493 |
| 499 | 1.449953e-19 | 0.001493 |
| 500 | 1.449953e-19 | 0.001493 |
| 501 | 1.449953e-19 | 0.001493 |
| 502 | 1.449953e-19 | 0.001493 |
| 503 | 1.449953e-19 | 0.001493 |
| 504 | 1.449953e-19 | 0.001493 |
| 505 | 1.449953e-19 | 0.001493 |
| 506 | 1.459072e-19 | 0.001493 |
| 507 | 1.459072e-19 | 0.001493 |
| 508 | 1.459072e-19 | 0.001493 |
| 509 | 1.459072e-19 | 0.001493 |
| 510 | 1.459072e-19 | 0.001493 |
| 511 | 1.459072e-19 | 0.001493 |
| 512 | 1.532026e-19 | 0.001493 |
| 513 | 1.550264e-19 | 0.001493 |
| 514 | 1.550264e-19 | 0.001493 |
| 515 | 1.550264e-19 | 0.001493 |
| 516 | 1.550264e-19 | 0.001493 |
| 517 | 1.568502e-19 | 0.001493 |
| 518 | 1.568502e-19 | 0.001493 |
| 519 | 1.601939e-19 | 0.001493 |
| 520 | 1.653615e-19 | 0.001493 |
| 521 | 1.714410e-19 | 0.001493 |
| 522 | 1.756966e-19 | 0.001493 |
| 523 | 1.756966e-19 | 0.001493 |
| 524 | 1.756966e-19 | 0.001493 |
| 525 | 1.756966e-19 | 0.001493 |
| 526 | 1.778244e-19 | 0.001493 |
| 527 | 1.808641e-19 | 0.001493 |
| 528 | 1.823840e-19 | 0.001493 |
| 529 | 1.823840e-19 | 0.001493 |
| 530 | 1.860317e-19 | 0.001493 |
| 531 | 1.860317e-19 | 0.001493 |
| 532 | 1.860317e-19 | 0.001493 |
| 533 | 1.860317e-19 | 0.001493 |
| 534 | 1.860317e-19 | 0.001493 |
| 535 | 1.860317e-19 | 0.001493 |
| 536 | 1.896793e-19 | 0.001493 |
| 537 | 1.905913e-19 | 0.001493 |
| 538 | 1.933270e-19 | 0.001493 |
| 539 | 1.933270e-19 | 0.001493 |
| 540 | 1.933270e-19 | 0.001493 |
| 541 | 1.933270e-19 | 0.001493 |
| 542 | 1.933270e-19 | 0.001493 |
| 543 | 1.933270e-19 | 0.001493 |
| 544 | 1.933270e-19 | 0.001493 |
| 545 | 1.933270e-19 | 0.001493 |
| 546 | 1.933270e-19 | 0.001493 |
| 547 | 1.960628e-19 | 0.001493 |
| 548 | 1.969747e-19 | 0.001493 |
| 549 | 2.006224e-19 | 0.001493 |
| 550 | 2.006224e-19 | 0.001493 |
| 551 | 2.015343e-19 | 0.001493 |
| 552 | 2.015343e-19 | 0.001493 |
| 553 | 2.015343e-19 | 0.001493 |
| 554 | 2.042701e-19 | 0.001493 |
| 555 | 2.067019e-19 | 0.001493 |
| 556 | 2.067019e-19 | 0.001493 |
| 557 | 2.067019e-19 | 0.001493 |
| 558 | 2.067019e-19 | 0.001493 |
| 559 | 2.067019e-19 | 0.001493 |
| 560 | 2.079177e-19 | 0.001493 |
| 561 | 2.188608e-19 | 0.001493 |
| 562 | 2.273720e-19 | 0.001493 |
| 563 | 2.325396e-19 | 0.001493 |
| 564 | 2.416588e-19 | 0.001493 |
| 565 | 2.416588e-19 | 0.001493 |
| 566 | 2.416588e-19 | 0.001493 |
| 567 | 2.416588e-19 | 0.001493 |
| 568 | 2.416588e-19 | 0.001493 |
| 569 | 2.416588e-19 | 0.001493 |
| 570 | 2.416588e-19 | 0.001493 |
| 571 | 2.425707e-19 | 0.001493 |
| 572 | 2.480422e-19 | 0.001493 |
| 573 | 2.480422e-19 | 0.001493 |
| 574 | 2.480422e-19 | 0.001493 |
| 575 | 2.480422e-19 | 0.001493 |
| 576 | 2.571614e-19 | 0.001493 |
| 577 | 2.583773e-19 | 0.001493 |
| 578 | 2.583773e-19 | 0.001493 |
| 579 | 2.790475e-19 | 0.001493 |
| 580 | 2.790475e-19 | 0.001493 |
| 581 | 2.808713e-19 | 0.001493 |
| 582 | 2.826952e-19 | 0.001493 |
| 583 | 2.893826e-19 | 0.001493 |
| 584 | 2.893826e-19 | 0.001493 |
| 585 | 2.899905e-19 | 0.001493 |
| 586 | 2.899905e-19 | 0.001493 |
| 587 | 2.899905e-19 | 0.001493 |
| 588 | 3.064051e-19 | 0.001493 |
| 589 | 3.100528e-19 | 0.001493 |
| 590 | 3.100528e-19 | 0.001493 |
| 591 | 3.127885e-19 | 0.001493 |
| 592 | 3.127885e-19 | 0.001493 |
| 593 | 3.127885e-19 | 0.001493 |
| 594 | 3.255554e-19 | 0.001493 |
| 595 | 3.255554e-19 | 0.001493 |
| 596 | 3.255554e-19 | 0.001493 |
| 597 | 3.307230e-19 | 0.001493 |
| 598 | 3.307230e-19 | 0.001493 |
| 599 | 3.307230e-19 | 0.001493 |
| 600 | 3.383223e-19 | 0.001493 |
| 601 | 3.410581e-19 | 0.001493 |
| 602 | 3.410581e-19 | 0.001493 |
| 603 | 3.410581e-19 | 0.001493 |
| 604 | 3.410581e-19 | 0.001493 |
| 605 | 3.410581e-19 | 0.001493 |
| 606 | 3.447057e-19 | 0.001493 |
| 607 | 3.565607e-19 | 0.001493 |
| 608 | 3.565607e-19 | 0.001493 |
| 609 | 3.617282e-19 | 0.001493 |
| 610 | 3.617282e-19 | 0.001493 |
| 611 | 3.617282e-19 | 0.001493 |
| 612 | 3.720633e-19 | 0.001493 |
| 613 | 3.720633e-19 | 0.001493 |
| 614 | 3.720633e-19 | 0.001493 |
| 615 | 3.811825e-19 | 0.001493 |
| 616 | 3.811825e-19 | 0.001493 |
| 617 | 3.866541e-19 | 0.001493 |
| 618 | 3.875660e-19 | 0.001493 |
| 619 | 3.875660e-19 | 0.001493 |
| 620 | 3.927335e-19 | 0.001493 |
| 621 | 3.927335e-19 | 0.001493 |
| 622 | 3.927335e-19 | 0.001493 |
| 623 | 3.927335e-19 | 0.001493 |
| 624 | 4.030686e-19 | 0.001493 |
| 625 | 4.030686e-19 | 0.001493 |
| 626 | 4.134037e-19 | 0.001493 |
| 627 | 4.134037e-19 | 0.001493 |
| 628 | 4.134037e-19 | 0.001493 |
| 629 | 4.392414e-19 | 0.001493 |
| 630 | 4.392414e-19 | 0.001493 |
| 631 | 4.392414e-19 | 0.001493 |
| 632 | 4.431931e-19 | 0.001493 |
| 633 | 4.431931e-19 | 0.001493 |
| 634 | 4.495765e-19 | 0.001493 |
| 635 | 4.805818e-19 | 0.001493 |
| 636 | 4.960845e-19 | 0.001493 |
| 637 | 4.960845e-19 | 0.001493 |
| 638 | 5.015560e-19 | 0.001493 |
| 639 | 5.270897e-19 | 0.001493 |
| 640 | 5.416805e-19 | 0.001493 |
| 641 | 5.416805e-19 | 0.001493 |
| 642 | 5.799811e-19 | 0.001493 |
| 643 | 5.994354e-19 | 0.001493 |
| 644 | 6.356082e-19 | 0.001493 |
| 645 | 6.407758e-19 | 0.001493 |
| 646 | 6.821161e-19 | 0.001493 |
| 647 | 7.441267e-19 | 0.001493 |
| 648 | 7.441267e-19 | 0.001493 |
| 649 | 7.906346e-19 | 0.001493 |
| 650 | 8.061372e-19 | 0.001493 |
| 651 | 8.371425e-19 | 0.001493 |
| 652 | 8.371425e-19 | 0.001493 |
| 653 | 8.784829e-19 | 0.001493 |
| 654 | 9.043206e-19 | 0.001493 |
| 655 | 9.611636e-19 | 0.001493 |
| 656 | 9.921689e-19 | 0.001493 |
| 657 | 1.012839e-18 | 0.001493 |
| 658 | 1.012839e-18 | 0.001493 |
| 659 | 1.136860e-18 | 0.001493 |
| 660 | 1.167258e-18 | 0.001493 |
| 661 | 1.167258e-18 | 0.001493 |
| 662 | 1.167258e-18 | 0.001493 |
| 663 | 1.167258e-18 | 0.001493 |
| 664 | 1.167258e-18 | 0.001493 |
| 665 | 1.167258e-18 | 0.001493 |
| 666 | 1.167258e-18 | 0.001493 |
| 667 | 1.167258e-18 | 0.001493 |
| 668 | 1.167258e-18 | 0.001493 |
| 669 | 1.263921e-18 | 0.001493 |
| 670 | 1.333227e-18 | 0.001493 |
| 671 | 1.529594e-18 | 0.001493 |
| 672 | 1.895882e-18 | 0.001493 |
| 673 | 2.333603e-18 | 0.001493 |
| 674 | 2.334515e-18 | 0.001493 |
| 675 | 3.115119e-18 | 0.001493 |
| 676 | 3.115119e-18 | 0.001493 |
| 677 | 3.591445e-18 | 0.001493 |
| 678 | 3.889339e-18 | 0.001493 |
| 679 | 3.889339e-18 | 0.001493 |
| 680 | 4.085401e-18 | 0.001493 |
| 681 | 4.470232e-18 | 0.001493 |
| 682 | 4.906129e-18 | 0.001493 |
| 683 | 5.193384e-18 | 0.001493 |
| 684 | 5.953013e-18 | 0.001493 |
| 685 | 6.230237e-18 | 0.001493 |
| 686 | 6.230237e-18 | 0.001493 |
| 687 | 1.011228e-17 | 0.001493 |
| 688 | 2.485590e-17 | 0.001493 |
| 689 | 3.426083e-17 | 0.001493 |
| 690 | 4.046736e-17 | 0.001493 |
| 691 | 4.589420e-17 | 0.001493 |
| 692 | 4.233977e-16 | 0.001493 |
| 693 | 4.193254e-07 | 0.001493 |
| 694 | 5.580950e-07 | 0.001494 |
| 695 | 7.160788e-07 | 0.001496 |
| 696 | 7.349229e-07 | 0.001497 |
| 697 | 7.420237e-07 | 0.001500 |
| 698 | 8.850128e-07 | 0.001501 |
| 699 | 9.520593e-07 | 0.001503 |
| 700 | 9.973954e-07 | 0.001505 |
| 701 | 1.054799e-06 | 0.001507 |
| 702 | 1.074118e-06 | 0.001509 |
| 703 | 1.074118e-06 | 0.001512 |
| 704 | 1.074118e-06 | 0.001514 |
| 705 | 1.074118e-06 | 0.001516 |
| 706 | 1.089640e-06 | 0.001518 |
| 707 | 1.089640e-06 | 0.001520 |
| 708 | 1.129830e-06 | 0.001523 |
| 709 | 1.132179e-06 | 0.001524 |
| 710 | 1.132179e-06 | 0.001525 |
| 711 | 1.132179e-06 | 0.001526 |
| 712 | 1.132179e-06 | 0.001527 |
| 713 | 1.132179e-06 | 0.001528 |
| 714 | 1.132179e-06 | 0.001529 |
| 715 | 1.132179e-06 | 0.001530 |
| 716 | 1.132179e-06 | 0.001532 |
| 717 | 1.132179e-06 | 0.001533 |
| 718 | 1.132179e-06 | 0.001534 |
| 719 | 1.132179e-06 | 0.001535 |
| 720 | 1.132179e-06 | 0.001536 |
| 721 | 1.132179e-06 | 0.001537 |
| 722 | 1.132179e-06 | 0.001538 |
| 723 | 1.132179e-06 | 0.001540 |
| 724 | 1.132179e-06 | 0.001541 |
| 725 | 1.132179e-06 | 0.001542 |
| 726 | 1.132179e-06 | 0.001543 |
| 727 | 1.132179e-06 | 0.001544 |
| 728 | 1.132179e-06 | 0.001545 |
| 729 | 1.142471e-06 | 0.001547 |
| 730 | 1.215257e-06 | 0.001550 |
| 731 | 1.302299e-06 | 0.001554 |
| 732 | 1.469846e-06 | 0.001555 |
| 733 | 1.469846e-06 | 0.001557 |
| 734 | 1.469846e-06 | 0.001558 |
| 735 | 1.551504e-06 | 0.001560 |
| 736 | 1.551504e-06 | 0.001561 |
| 737 | 1.551504e-06 | 0.001563 |
| 738 | 1.551504e-06 | 0.001564 |
| 739 | 1.551504e-06 | 0.001566 |
| 740 | 1.551504e-06 | 0.001568 |
| 741 | 1.551504e-06 | 0.001569 |
| 742 | 1.551504e-06 | 0.001571 |
| 743 | 1.551504e-06 | 0.001572 |
| 744 | 1.551504e-06 | 0.001574 |
| 745 | 1.551504e-06 | 0.001575 |
| 746 | 1.551504e-06 | 0.001577 |
| 747 | 1.551504e-06 | 0.001578 |
| 748 | 1.675624e-06 | 0.001582 |
| 749 | 1.770026e-06 | 0.001583 |
| 750 | 1.770026e-06 | 0.001585 |
| 751 | 1.770026e-06 | 0.001587 |
| 752 | 1.770026e-06 | 0.001589 |
| 753 | 1.770026e-06 | 0.001591 |
| 754 | 1.770026e-06 | 0.001592 |
| 755 | 1.994791e-06 | 0.001594 |
| 756 | 1.994791e-06 | 0.001596 |
| 757 | 2.060194e-06 | 0.001598 |
| 758 | 2.179280e-06 | 0.001601 |
| 759 | 3.320560e-06 | 0.001617 |
| 760 | 3.642662e-06 | 0.001621 |
| 761 | 4.018284e-06 | 0.001637 |
| 762 | 4.059167e-06 | 0.001653 |
| 763 | 4.083676e-06 | 0.001669 |
| 764 | 4.085716e-06 | 0.001686 |
| 765 | 4.105249e-06 | 0.001702 |
| 766 | 4.290119e-06 | 0.001719 |
| 767 | 4.520981e-06 | 0.001737 |
| 768 | 4.654512e-06 | 0.001756 |
| 769 | 4.664229e-06 | 0.001789 |
| 770 | 4.673987e-06 | 0.001821 |
| 771 | 4.688611e-06 | 0.001854 |
| 772 | 4.762609e-06 | 0.001888 |
| 773 | 4.978687e-06 | 0.001937 |
| 774 | 5.289218e-06 | 0.001953 |
| 775 | 5.341243e-06 | 0.001969 |
| 776 | 5.341243e-06 | 0.001985 |
| 777 | 5.341243e-06 | 0.002001 |
| 778 | 5.357712e-06 | 0.002017 |
| 779 | 5.357712e-06 | 0.002033 |
| 780 | 5.364522e-06 | 0.002066 |
| 781 | 5.370591e-06 | 0.002082 |
| 782 | 5.370591e-06 | 0.002098 |
| 783 | 5.370591e-06 | 0.002114 |
| 784 | 5.380939e-06 | 0.002130 |
| 785 | 5.380939e-06 | 0.002146 |
| 786 | 5.380939e-06 | 0.002162 |
| 787 | 5.389435e-06 | 0.002179 |
| 788 | 5.389435e-06 | 0.002195 |
| 789 | 5.389435e-06 | 0.002211 |
| 790 | 5.389435e-06 | 0.002227 |
| 791 | 5.389435e-06 | 0.002243 |
| 792 | 5.396536e-06 | 0.002259 |
| 793 | 5.396536e-06 | 0.002276 |
| 794 | 5.396536e-06 | 0.002292 |
| 795 | 5.402559e-06 | 0.002308 |
| 796 | 5.402559e-06 | 0.002324 |
| 797 | 5.403388e-06 | 0.002340 |
| 798 | 5.416159e-06 | 0.002357 |
| 799 | 5.416159e-06 | 0.002373 |
| 800 | 5.419637e-06 | 0.002389 |
| 801 | 5.419637e-06 | 0.002405 |
| 802 | 5.419637e-06 | 0.002422 |
| 803 | 5.422732e-06 | 0.002438 |
| 804 | 5.422732e-06 | 0.002454 |
| 805 | 5.428002e-06 | 0.002471 |
| 806 | 5.428002e-06 | 0.002487 |
| 807 | 5.428002e-06 | 0.002503 |
| 808 | 5.430264e-06 | 0.002519 |
| 809 | 5.430264e-06 | 0.002536 |
| 810 | 5.432322e-06 | 0.002552 |
| 811 | 5.437514e-06 | 0.002568 |
| 812 | 5.438980e-06 | 0.002585 |
| 813 | 5.440339e-06 | 0.002601 |
| 814 | 5.441601e-06 | 0.002617 |
| 815 | 5.442776e-06 | 0.002634 |
| 816 | 5.442776e-06 | 0.002650 |
| 817 | 5.444901e-06 | 0.002666 |
| 818 | 5.446769e-06 | 0.002683 |
| 819 | 5.446769e-06 | 0.002699 |
| 820 | 5.449185e-06 | 0.002715 |
| 821 | 5.449185e-06 | 0.002732 |
| 822 | 5.450248e-06 | 0.002764 |
| 823 | 5.450584e-06 | 0.002781 |
| 824 | 5.451230e-06 | 0.002797 |
| 825 | 5.452098e-06 | 0.002813 |
| 826 | 5.452985e-06 | 0.002830 |
| 827 | 5.452985e-06 | 0.002846 |
| 828 | 5.453184e-06 | 0.002879 |
| 829 | 5.453515e-06 | 0.002895 |
| 830 | 5.453515e-06 | 0.002912 |
| 831 | 5.454969e-06 | 0.002928 |
| 832 | 5.455838e-06 | 0.002944 |
| 833 | 5.457376e-06 | 0.002961 |
| 834 | 5.457376e-06 | 0.002977 |
| 835 | 5.458996e-06 | 0.002993 |
| 836 | 5.459567e-06 | 0.003010 |
| 837 | 5.460101e-06 | 0.003026 |
| 838 | 5.461927e-06 | 0.003043 |
| 839 | 5.462691e-06 | 0.003059 |
| 840 | 5.463042e-06 | 0.003075 |
| 841 | 5.463042e-06 | 0.003092 |
| 842 | 5.463692e-06 | 0.003108 |
| 843 | 5.465298e-06 | 0.003125 |
| 844 | 5.466250e-06 | 0.003141 |
| 845 | 5.467045e-06 | 0.003157 |
| 846 | 5.468560e-06 | 0.003190 |
| 847 | 5.471536e-06 | 0.003207 |
| 848 | 5.471624e-06 | 0.003223 |
| 849 | 5.475570e-06 | 0.003239 |
| 850 | 5.502293e-06 | 0.003256 |
| 851 | 5.509233e-06 | 0.003272 |
| 852 | 5.512414e-06 | 0.003306 |
| 853 | 5.599092e-06 | 0.003322 |
| 854 | 5.633437e-06 | 0.003339 |
| 855 | 5.650582e-06 | 0.003356 |
| 856 | 5.688848e-06 | 0.003373 |
| 857 | 5.884006e-06 | 0.003391 |
| 858 | 6.062887e-06 | 0.003439 |
| 859 | 6.133341e-06 | 0.003488 |
| 860 | 6.173979e-06 | 0.003556 |
| 861 | 6.178340e-06 | 0.003575 |
| 862 | 6.206016e-06 | 0.003606 |
| 863 | 6.293425e-06 | 0.003637 |
| 864 | 6.415679e-06 | 0.003650 |
| 865 | 6.429254e-06 | 0.003682 |
| 866 | 6.433658e-06 | 0.003715 |
| 867 | 6.437427e-06 | 0.003747 |
| 868 | 6.451239e-06 | 0.003779 |
| 869 | 6.451353e-06 | 0.003811 |
| 870 | 6.464022e-06 | 0.003844 |
| 871 | 6.475034e-06 | 0.003876 |
| 872 | 6.486288e-06 | 0.003908 |
| 873 | 6.514993e-06 | 0.003941 |
| 874 | 6.547495e-06 | 0.003974 |
| 875 | 6.547734e-06 | 0.004006 |
| 876 | 6.656085e-06 | 0.004026 |
| 877 | 6.668418e-06 | 0.004060 |
| 878 | 6.813193e-06 | 0.004107 |
| 879 | 6.852673e-06 | 0.004190 |
| 880 | 6.970808e-06 | 0.004238 |
| 881 | 6.981768e-06 | 0.004252 |
| 882 | 6.981768e-06 | 0.004266 |
| 883 | 6.981768e-06 | 0.004280 |
| 884 | 6.993571e-06 | 0.004329 |
| 885 | 7.015233e-06 | 0.004378 |
| 886 | 7.072440e-06 | 0.004428 |
| 887 | 7.107862e-06 | 0.004478 |
| 888 | 7.135873e-06 | 0.004542 |
| 889 | 7.260477e-06 | 0.004622 |
| 890 | 7.332346e-06 | 0.004702 |
| 891 | 7.338195e-06 | 0.004717 |
| 892 | 7.349229e-06 | 0.004746 |
| 893 | 7.349229e-06 | 0.004776 |
| 894 | 7.364805e-06 | 0.004857 |
| 895 | 7.389793e-06 | 0.004886 |
| 896 | 7.421958e-06 | 0.004968 |
| 897 | 7.461168e-06 | 0.005199 |
| 898 | 7.506686e-06 | 0.005297 |
| 899 | 7.547857e-06 | 0.005312 |
| 900 | 7.547857e-06 | 0.005327 |
| 901 | 7.547857e-06 | 0.005342 |
| 902 | 7.547857e-06 | 0.005357 |
| 903 | 7.547857e-06 | 0.005372 |
| 904 | 7.547857e-06 | 0.005388 |
| 905 | 7.547857e-06 | 0.005403 |
| 906 | 7.547857e-06 | 0.005418 |
| 907 | 7.547857e-06 | 0.005433 |
| 908 | 7.547857e-06 | 0.005448 |
| 909 | 7.547857e-06 | 0.005463 |
| 910 | 7.547857e-06 | 0.005478 |
| 911 | 7.547857e-06 | 0.005493 |
| 912 | 7.547857e-06 | 0.005508 |
| 913 | 7.547857e-06 | 0.005523 |
| 914 | 7.547857e-06 | 0.005539 |
| 915 | 7.547857e-06 | 0.005554 |
| 916 | 7.547857e-06 | 0.005569 |
| 917 | 7.547857e-06 | 0.005584 |
| 918 | 7.547857e-06 | 0.005599 |
| 919 | 7.547857e-06 | 0.005614 |
| 920 | 7.547857e-06 | 0.005629 |
| 921 | 7.547857e-06 | 0.005644 |
| 922 | 7.547857e-06 | 0.005659 |
| 923 | 7.547857e-06 | 0.005674 |
| 924 | 7.547857e-06 | 0.005689 |
| 925 | 7.547857e-06 | 0.005705 |
| 926 | 7.547857e-06 | 0.005720 |
| 927 | 7.547857e-06 | 0.005735 |
| 928 | 7.547857e-06 | 0.005750 |
| 929 | 7.548226e-06 | 0.005863 |
| 930 | 7.585039e-06 | 0.005878 |
| 931 | 7.672273e-06 | 0.005909 |
| 932 | 7.716691e-06 | 0.005924 |
| 933 | 7.716691e-06 | 0.005940 |
| 934 | 7.716691e-06 | 0.005955 |
| 935 | 7.721697e-06 | 0.005971 |
| 936 | 7.736734e-06 | 0.006002 |
| 937 | 7.757520e-06 | 0.006017 |
| 938 | 7.757520e-06 | 0.006033 |
| 939 | 7.757520e-06 | 0.006048 |
| 940 | 7.757520e-06 | 0.006064 |
| 941 | 7.757520e-06 | 0.006079 |
| 942 | 7.757520e-06 | 0.006095 |
| 943 | 7.757520e-06 | 0.006110 |
| 944 | 7.757520e-06 | 0.006126 |
| 945 | 7.757520e-06 | 0.006141 |
| 946 | 7.757520e-06 | 0.006157 |
| 947 | 7.757520e-06 | 0.006172 |
| 948 | 7.757520e-06 | 0.006203 |
| 949 | 7.757520e-06 | 0.006219 |
| 950 | 7.757520e-06 | 0.006234 |
| 951 | 7.757520e-06 | 0.006250 |
| 952 | 7.757520e-06 | 0.006265 |
| 953 | 7.757520e-06 | 0.006281 |
| 954 | 7.757520e-06 | 0.006296 |
| 955 | 7.757520e-06 | 0.006312 |
| 956 | 7.757520e-06 | 0.006327 |
| 957 | 7.757520e-06 | 0.006343 |
| 958 | 7.757520e-06 | 0.006358 |
| 959 | 7.757520e-06 | 0.006374 |
| 960 | 7.757520e-06 | 0.006389 |
| 961 | 7.757520e-06 | 0.006405 |
| 962 | 7.757520e-06 | 0.006421 |
| 963 | 7.757520e-06 | 0.006436 |
| 964 | 7.757520e-06 | 0.006452 |
| 965 | 7.757520e-06 | 0.006467 |
| 966 | 7.757520e-06 | 0.006483 |
| 967 | 7.757520e-06 | 0.006498 |
| 968 | 7.757520e-06 | 0.006514 |
| 969 | 7.757520e-06 | 0.006529 |
| 970 | 7.757520e-06 | 0.006545 |
| 971 | 7.757520e-06 | 0.006560 |
| 972 | 7.757520e-06 | 0.006576 |
| 973 | 7.757520e-06 | 0.006591 |
| 974 | 7.757520e-06 | 0.006607 |
| 975 | 7.757520e-06 | 0.006622 |
| 976 | 7.757520e-06 | 0.006638 |
| 977 | 7.757520e-06 | 0.006653 |
| 978 | 7.757520e-06 | 0.006669 |
| 979 | 7.757520e-06 | 0.006684 |
| 980 | 7.757520e-06 | 0.006700 |
| 981 | 7.764280e-06 | 0.006715 |
| 982 | 7.771209e-06 | 0.006731 |
| 983 | 7.786936e-06 | 0.006746 |
| 984 | 7.796688e-06 | 0.006762 |
| 985 | 7.796688e-06 | 0.006778 |
| 986 | 7.796688e-06 | 0.006793 |
| 987 | 7.796688e-06 | 0.006809 |
| 988 | 7.797819e-06 | 0.006824 |
| 989 | 7.811768e-06 | 0.006840 |
| 990 | 7.819580e-06 | 0.006871 |
| 991 | 7.821748e-06 | 0.006903 |
| 992 | 7.845576e-06 | 0.006918 |
| 993 | 7.845576e-06 | 0.006934 |
| 994 | 7.845576e-06 | 0.006950 |
| 995 | 7.862351e-06 | 0.006981 |
| 996 | 7.866781e-06 | 0.006997 |
| 997 | 7.866781e-06 | 0.007013 |
| 998 | 7.866781e-06 | 0.007028 |
| 999 | 7.866781e-06 | 0.007044 |
| 1000 | 7.866781e-06 | 0.007060 |
| 1001 | 7.866781e-06 | 0.007076 |
| 1002 | 7.866781e-06 | 0.007091 |
| 1003 | 7.866781e-06 | 0.007107 |
| 1004 | 7.866781e-06 | 0.007123 |
| 1005 | 7.866781e-06 | 0.007138 |
| 1006 | 7.866781e-06 | 0.007154 |
| 1007 | 7.866781e-06 | 0.007170 |
| 1008 | 7.866781e-06 | 0.007186 |
| 1009 | 7.866781e-06 | 0.007201 |
| 1010 | 7.866781e-06 | 0.007217 |
| 1011 | 7.866781e-06 | 0.007233 |
| 1012 | 7.866781e-06 | 0.007264 |
| 1013 | 7.866781e-06 | 0.007280 |
| 1014 | 7.866781e-06 | 0.007312 |
| 1015 | 7.866781e-06 | 0.007327 |
| 1016 | 7.866781e-06 | 0.007343 |
| 1017 | 7.866781e-06 | 0.007359 |
| 1018 | 7.866781e-06 | 0.007374 |
| 1019 | 7.866781e-06 | 0.007390 |
| 1020 | 7.866781e-06 | 0.007422 |
| 1021 | 7.866781e-06 | 0.007437 |
| 1022 | 7.866781e-06 | 0.007453 |
| 1023 | 7.866781e-06 | 0.007469 |
| 1024 | 7.866781e-06 | 0.007485 |
| 1025 | 7.866781e-06 | 0.007516 |
| 1026 | 7.866781e-06 | 0.007532 |
| 1027 | 7.866781e-06 | 0.007563 |
| 1028 | 7.866781e-06 | 0.007579 |
| 1029 | 7.866781e-06 | 0.007610 |
| 1030 | 7.866781e-06 | 0.007626 |
| 1031 | 7.866781e-06 | 0.007642 |
| 1032 | 7.866781e-06 | 0.007658 |
| 1033 | 7.866781e-06 | 0.007673 |
| 1034 | 7.866781e-06 | 0.007689 |
| 1035 | 7.866781e-06 | 0.007705 |
| 1036 | 7.874174e-06 | 0.007721 |
| ... | ... | ... |
| 2066 | 2.566271e-05 | 0.055454 |

*(Output truncated for readability: rows 1036–2066 listing effective alpha values in ascending order alongside the corresponding total impurities, as produced by the cost-complexity pruning path of the decision tree.)*
| 2067 | 2.566271e-05 | 0.055480 |
| 2068 | 2.566271e-05 | 0.055506 |
| 2069 | 2.566271e-05 | 0.055531 |
| 2070 | 2.571820e-05 | 0.055583 |
| 2071 | 2.575742e-05 | 0.055660 |
| 2072 | 2.578797e-05 | 0.055841 |
| 2073 | 2.579336e-05 | 0.055944 |
| 2074 | 2.581146e-05 | 0.055969 |
| 2075 | 2.581146e-05 | 0.055995 |
| 2076 | 2.589350e-05 | 0.056280 |
| 2077 | 2.592439e-05 | 0.056306 |
| 2078 | 2.593054e-05 | 0.056358 |
| 2079 | 2.594882e-05 | 0.056514 |
| 2080 | 2.594953e-05 | 0.056540 |
| 2081 | 2.603661e-05 | 0.056592 |
| 2082 | 2.609463e-05 | 0.056618 |
| 2083 | 2.612521e-05 | 0.056696 |
| 2084 | 2.615919e-05 | 0.056722 |
| 2085 | 2.631060e-05 | 0.056749 |
| 2086 | 2.637557e-05 | 0.056775 |
| 2087 | 2.637557e-05 | 0.056801 |
| 2088 | 2.637557e-05 | 0.056828 |
| 2089 | 2.659721e-05 | 0.056881 |
| 2090 | 2.659979e-05 | 0.056987 |
| 2091 | 2.660940e-05 | 0.057094 |
| 2092 | 2.673506e-05 | 0.057147 |
| 2093 | 2.673770e-05 | 0.057174 |
| 2094 | 2.689250e-05 | 0.057201 |
| 2095 | 2.697501e-05 | 0.057255 |
| 2096 | 2.701338e-05 | 0.057282 |
| 2097 | 2.711366e-05 | 0.057363 |
| 2098 | 2.712599e-05 | 0.057390 |
| 2099 | 2.715024e-05 | 0.057526 |
| 2100 | 2.716020e-05 | 0.057553 |
| 2101 | 2.718084e-05 | 0.057635 |
| 2102 | 2.724034e-05 | 0.057689 |
| 2103 | 2.737247e-05 | 0.057744 |
| 2104 | 2.738602e-05 | 0.058155 |
| 2105 | 2.745435e-05 | 0.058210 |
| 2106 | 2.774530e-05 | 0.058237 |
| 2107 | 2.774854e-05 | 0.058265 |
| 2108 | 2.776376e-05 | 0.058293 |
| 2109 | 2.778915e-05 | 0.058321 |
| 2110 | 2.778915e-05 | 0.058348 |
| 2111 | 2.778915e-05 | 0.058376 |
| 2112 | 2.780414e-05 | 0.058710 |
| 2113 | 2.787307e-05 | 0.058738 |
| 2114 | 2.789289e-05 | 0.058766 |
| 2115 | 2.792707e-05 | 0.058794 |
| 2116 | 2.792707e-05 | 0.058822 |
| 2117 | 2.792707e-05 | 0.058849 |
| 2118 | 2.792707e-05 | 0.058877 |
| 2119 | 2.792707e-05 | 0.058961 |
| 2120 | 2.810085e-05 | 0.059102 |
| 2121 | 2.812638e-05 | 0.059889 |
| 2122 | 2.816325e-05 | 0.059917 |
| 2123 | 2.825954e-05 | 0.059946 |
| 2124 | 2.841313e-05 | 0.059974 |
| 2125 | 2.844292e-05 | 0.060002 |
| 2126 | 2.869682e-05 | 0.060031 |
| 2127 | 2.869682e-05 | 0.060060 |
| 2128 | 2.878149e-05 | 0.060089 |
| 2129 | 2.879080e-05 | 0.060146 |
| 2130 | 2.879080e-05 | 0.060204 |
| 2131 | 2.880215e-05 | 0.060348 |
| 2132 | 2.880244e-05 | 0.060377 |
| 2133 | 2.894763e-05 | 0.060695 |
| 2134 | 2.906695e-05 | 0.060724 |
| 2135 | 2.912945e-05 | 0.060986 |
| 2136 | 2.914606e-05 | 0.061015 |
| 2137 | 2.916773e-05 | 0.061132 |
| 2138 | 2.921163e-05 | 0.061191 |
| 2139 | 2.922027e-05 | 0.061278 |
| 2140 | 2.922124e-05 | 0.061307 |
| 2141 | 2.922965e-05 | 0.061541 |
| 2142 | 2.927983e-05 | 0.061571 |
| 2143 | 2.928802e-05 | 0.061658 |
| 2144 | 2.939692e-05 | 0.061688 |
| 2145 | 2.939692e-05 | 0.061717 |
| 2146 | 2.952455e-05 | 0.061865 |
| 2147 | 2.968778e-05 | 0.061954 |
| 2148 | 2.984349e-05 | 0.061984 |
| 2149 | 2.984573e-05 | 0.062043 |
| 2150 | 2.985381e-05 | 0.062163 |
| 2151 | 2.994121e-05 | 0.062223 |
| 2152 | 3.014351e-05 | 0.062253 |
| 2153 | 3.016481e-05 | 0.062374 |
| 2154 | 3.019711e-05 | 0.062434 |
| 2155 | 3.036515e-05 | 0.062525 |
| 2156 | 3.044787e-05 | 0.062555 |
| 2157 | 3.051870e-05 | 0.062739 |
| 2158 | 3.058037e-05 | 0.062769 |
| 2159 | 3.067271e-05 | 0.062830 |
| 2160 | 3.068350e-05 | 0.062892 |
| 2161 | 3.077473e-05 | 0.063200 |
| 2162 | 3.086600e-05 | 0.063230 |
| 2163 | 3.096262e-05 | 0.063261 |
| 2164 | 3.096262e-05 | 0.063292 |
| 2165 | 3.101108e-05 | 0.063323 |
| 2166 | 3.108523e-05 | 0.063448 |
| 2167 | 3.113769e-05 | 0.063603 |
| 2168 | 3.119158e-05 | 0.063666 |
| 2169 | 3.123423e-05 | 0.063697 |
| 2170 | 3.127518e-05 | 0.063791 |
| 2171 | 3.170255e-05 | 0.063886 |
| 2172 | 3.170972e-05 | 0.063981 |
| 2173 | 3.186839e-05 | 0.064045 |
| 2174 | 3.200631e-05 | 0.064077 |
| 2175 | 3.203050e-05 | 0.064237 |
| 2176 | 3.222354e-05 | 0.064269 |
| 2177 | 3.222354e-05 | 0.064301 |
| 2178 | 3.222354e-05 | 0.064334 |
| 2179 | 3.236408e-05 | 0.064690 |
| 2180 | 3.238660e-05 | 0.064819 |
| 2181 | 3.264448e-05 | 0.064885 |
| 2182 | 3.268537e-05 | 0.064950 |
| 2183 | 3.285515e-05 | 0.065377 |
| 2184 | 3.298229e-05 | 0.065674 |
| 2185 | 3.312281e-05 | 0.065707 |
| 2186 | 3.313381e-05 | 0.065773 |
| 2187 | 3.314211e-05 | 0.065906 |
| 2188 | 3.347311e-05 | 0.065939 |
| 2189 | 3.347311e-05 | 0.065973 |
| 2190 | 3.368873e-05 | 0.066040 |
| 2191 | 3.383893e-05 | 0.066074 |
| 2192 | 3.391735e-05 | 0.066481 |
| 2193 | 3.411679e-05 | 0.066754 |
| 2194 | 3.437178e-05 | 0.066823 |
| 2195 | 3.437178e-05 | 0.066891 |
| 2196 | 3.444959e-05 | 0.066960 |
| 2197 | 3.444972e-05 | 0.066995 |
| 2198 | 3.470469e-05 | 0.067029 |
| 2199 | 3.473855e-05 | 0.067099 |
| 2200 | 3.478237e-05 | 0.067168 |
| 2201 | 3.501284e-05 | 0.067274 |
| 2202 | 3.549092e-05 | 0.067628 |
| 2203 | 3.575484e-05 | 0.067736 |
| 2204 | 3.580963e-05 | 0.067915 |
| 2205 | 3.585232e-05 | 0.067986 |
| 2206 | 3.588513e-05 | 0.068022 |
| 2207 | 3.591127e-05 | 0.068058 |
| 2208 | 3.605607e-05 | 0.068239 |
| 2209 | 3.607481e-05 | 0.068383 |
| 2210 | 3.608207e-05 | 0.068491 |
| 2211 | 3.653407e-05 | 0.068856 |
| 2212 | 3.677651e-05 | 0.068893 |
| 2213 | 3.694967e-05 | 0.068930 |
| 2214 | 3.710583e-05 | 0.069524 |
| 2215 | 3.713647e-05 | 0.069672 |
| 2216 | 3.715515e-05 | 0.069710 |
| 2217 | 3.745826e-05 | 0.069747 |
| 2218 | 3.748107e-05 | 0.069822 |
| 2219 | 3.758616e-05 | 0.069860 |
| 2220 | 3.765448e-05 | 0.069935 |
| 2221 | 3.767938e-05 | 0.069973 |
| 2222 | 3.767938e-05 | 0.070010 |
| 2223 | 3.808762e-05 | 0.070124 |
| 2224 | 3.827483e-05 | 0.070278 |
| 2225 | 3.842001e-05 | 0.070431 |
| 2226 | 3.846103e-05 | 0.070700 |
| 2227 | 3.854525e-05 | 0.070855 |
| 2228 | 3.854914e-05 | 0.070932 |
| 2229 | 3.857537e-05 | 0.071125 |
| 2230 | 3.857748e-05 | 0.071163 |
| 2231 | 3.879699e-05 | 0.071202 |
| 2232 | 3.879699e-05 | 0.071241 |
| 2233 | 3.883866e-05 | 0.071513 |
| 2234 | 3.896801e-05 | 0.071552 |
| 2235 | 3.902419e-05 | 0.071669 |
| 2236 | 3.909542e-05 | 0.071747 |
| 2237 | 3.922623e-05 | 0.071865 |
| 2238 | 3.928378e-05 | 0.071904 |
| 2239 | 3.944460e-05 | 0.072062 |
| 2240 | 3.954276e-05 | 0.072141 |
| 2241 | 3.957688e-05 | 0.072180 |
| 2242 | 3.962734e-05 | 0.072418 |
| 2243 | 3.964876e-05 | 0.072497 |
| 2244 | 3.973279e-05 | 0.072537 |
| 2245 | 4.008107e-05 | 0.072577 |
| 2246 | 4.008107e-05 | 0.072617 |
| 2247 | 4.012759e-05 | 0.072657 |
| 2248 | 4.021498e-05 | 0.072738 |
| 2249 | 4.024013e-05 | 0.072778 |
| 2250 | 4.043404e-05 | 0.072819 |
| 2251 | 4.082130e-05 | 0.072859 |
| 2252 | 4.138954e-05 | 0.072901 |
| 2253 | 4.148807e-05 | 0.072984 |
| 2254 | 4.172135e-05 | 0.073109 |
| 2255 | 4.172171e-05 | 0.073651 |
| 2256 | 4.195259e-05 | 0.073945 |
| 2257 | 4.231509e-05 | 0.073987 |
| 2258 | 4.245643e-05 | 0.074030 |
| 2259 | 4.265095e-05 | 0.074371 |
| 2260 | 4.268573e-05 | 0.074456 |
| 2261 | 4.286768e-05 | 0.074842 |
| 2262 | 4.287742e-05 | 0.074928 |
| 2263 | 4.308541e-05 | 0.074971 |
| 2264 | 4.347765e-05 | 0.075275 |
| 2265 | 4.352271e-05 | 0.075319 |
| 2266 | 4.366772e-05 | 0.075362 |
| 2267 | 4.407462e-05 | 0.075583 |
| 2268 | 4.425690e-05 | 0.075716 |
| 2269 | 4.433774e-05 | 0.075760 |
| 2270 | 4.452081e-05 | 0.075804 |
| 2271 | 4.473989e-05 | 0.075849 |
| 2272 | 4.496634e-05 | 0.075894 |
| 2273 | 4.498429e-05 | 0.076839 |
| 2274 | 4.498893e-05 | 0.077064 |
| 2275 | 4.524865e-05 | 0.077109 |
| 2276 | 4.568603e-05 | 0.077155 |
| 2277 | 4.573211e-05 | 0.077292 |
| 2278 | 4.580659e-05 | 0.077338 |
| 2279 | 4.598409e-05 | 0.077384 |
| 2280 | 4.626932e-05 | 0.077430 |
| 2281 | 4.656591e-05 | 0.077523 |
| 2282 | 4.673389e-05 | 0.077570 |
| 2283 | 4.675705e-05 | 0.077710 |
| 2284 | 4.700502e-05 | 0.077757 |
| 2285 | 4.709923e-05 | 0.077804 |
| 2286 | 4.727710e-05 | 0.077899 |
| 2287 | 4.737326e-05 | 0.077993 |
| 2288 | 4.756338e-05 | 0.078041 |
| 2289 | 4.764755e-05 | 0.078089 |
| 2290 | 4.766419e-05 | 0.078184 |
| 2291 | 4.816408e-05 | 0.078232 |
| 2292 | 4.817263e-05 | 0.078955 |
| 2293 | 4.943702e-05 | 0.079004 |
| 2294 | 5.009761e-05 | 0.079054 |
| 2295 | 5.039088e-05 | 0.079105 |
| 2296 | 5.060216e-05 | 0.079358 |
| 2297 | 5.074839e-05 | 0.079611 |
| 2298 | 5.098101e-05 | 0.079713 |
| 2299 | 5.154309e-05 | 0.079868 |
| 2300 | 5.225520e-05 | 0.079920 |
| 2301 | 5.236736e-05 | 0.080182 |
| 2302 | 5.298417e-05 | 0.080235 |
| 2303 | 5.327849e-05 | 0.080395 |
| 2304 | 5.361493e-05 | 0.080449 |
| 2305 | 5.395003e-05 | 0.080503 |
| 2306 | 5.401579e-05 | 0.080557 |
| 2307 | 5.462539e-05 | 0.080611 |
| 2308 | 5.478849e-05 | 0.080776 |
| 2309 | 5.517229e-05 | 0.080831 |
| 2310 | 5.544455e-05 | 0.081052 |
| 2311 | 5.592055e-05 | 0.081388 |
| 2312 | 5.652132e-05 | 0.081445 |
| 2313 | 5.683749e-05 | 0.081501 |
| 2314 | 5.758159e-05 | 0.081559 |
| 2315 | 5.785939e-05 | 0.081848 |
| 2316 | 5.803365e-05 | 0.082196 |
| 2317 | 5.832757e-05 | 0.082255 |
| 2318 | 5.903224e-05 | 0.082314 |
| 2319 | 5.906596e-05 | 0.082668 |
| 2320 | 5.920037e-05 | 0.082727 |
| 2321 | 6.014032e-05 | 0.082908 |
| 2322 | 6.034979e-05 | 0.083029 |
| 2323 | 6.056664e-05 | 0.083089 |
| 2324 | 6.117209e-05 | 0.083150 |
| 2325 | 6.151504e-05 | 0.083273 |
| 2326 | 6.253420e-05 | 0.083523 |
| 2327 | 6.337119e-05 | 0.083587 |
| 2328 | 6.344293e-05 | 0.083650 |
| 2329 | 6.367993e-05 | 0.083714 |
| 2330 | 6.391003e-05 | 0.083778 |
| 2331 | 6.475884e-05 | 0.083907 |
| 2332 | 6.732828e-05 | 0.083975 |
| 2333 | 6.913798e-05 | 0.084044 |
| 2334 | 7.021938e-05 | 0.084114 |
| 2335 | 7.265896e-05 | 0.084187 |
| 2336 | 7.267947e-05 | 0.084332 |
| 2337 | 7.278281e-05 | 0.084405 |
| 2338 | 7.427886e-05 | 0.084479 |
| 2339 | 7.530896e-05 | 0.084554 |
| 2340 | 7.531655e-05 | 0.084630 |
| 2341 | 7.533146e-05 | 0.085006 |
| 2342 | 7.536602e-05 | 0.085157 |
| 2343 | 7.537662e-05 | 0.085233 |
| 2344 | 7.615502e-05 | 0.085309 |
| 2345 | 7.793903e-05 | 0.086088 |
| 2346 | 7.908998e-05 | 0.086167 |
| 2347 | 7.931776e-05 | 0.086246 |
| 2348 | 7.983934e-05 | 0.086326 |
| 2349 | 8.233217e-05 | 0.086409 |
| 2350 | 8.343726e-05 | 0.086492 |
| 2351 | 8.748011e-05 | 0.086580 |
| 2352 | 8.752842e-05 | 0.086755 |
| 2353 | 9.012675e-05 | 0.086845 |
| 2354 | 9.143487e-05 | 0.086936 |
| 2355 | 9.535782e-05 | 0.087127 |
| 2356 | 9.720624e-05 | 0.087224 |
| 2357 | 9.727234e-05 | 0.087613 |
| 2358 | 9.752925e-05 | 0.087711 |
| 2359 | 9.933940e-05 | 0.088009 |
| 2360 | 1.053070e-04 | 0.088114 |
| 2361 | 1.073880e-04 | 0.088329 |
| 2362 | 1.094573e-04 | 0.088438 |
| 2363 | 1.326159e-04 | 0.088571 |
| 2364 | 1.468589e-04 | 0.088718 |
| 2365 | 1.537298e-04 | 0.088871 |
| 2366 | 1.630605e-04 | 0.089361 |
| 2367 | 1.723972e-04 | 0.089533 |
| 2368 | 1.998911e-04 | 0.090133 |
| 2369 | 2.130400e-04 | 0.090346 |
| 2370 | 2.279003e-04 | 0.090574 |
| 2371 | 2.409587e-04 | 0.090815 |
| 2372 | 2.462685e-04 | 0.091061 |
| 2373 | 2.653454e-04 | 0.091326 |
| 2374 | 3.238923e-04 | 0.091974 |
| 2375 | 3.434514e-04 | 0.092661 |
| 2376 | 3.577433e-04 | 0.093376 |
| 2377 | 3.757619e-04 | 0.093752 |
| 2378 | 4.651758e-04 | 0.094683 |
| 2379 | 4.711526e-04 | 0.095154 |
| 2380 | 5.216794e-04 | 0.095675 |
| 2381 | 7.109941e-04 | 0.096386 |
| 2382 | 2.252372e-03 | 0.098639 |
| 2383 | 2.762599e-03 | 0.106927 |
| 2384 | 3.162523e-03 | 0.113252 |
| 2385 | 3.495366e-03 | 0.116747 |
| 2386 | 8.461703e-03 | 0.125209 |
| 2387 | 1.179117e-02 | 0.137000 |
| 2388 | 1.516888e-02 | 0.152169 |
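The alphas and impurities tabulated above come from scikit-learn's minimal cost-complexity pruning path. A self-contained sketch of how such a table is produced (using a synthetic dataset as a stand-in, since `x_train2` and `y_train` are defined earlier in the notebook):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the notebook's x_train2 / y_train
X, y = make_classification(n_samples=500, n_features=8, random_state=1)

clf = DecisionTreeClassifier(random_state=1)
path = clf.cost_complexity_pruning_path(X, y)
ccp_alphas, impurities = path.ccp_alphas, path.impurities

# Alphas come back sorted ascending; pruning with a larger alpha
# removes more nodes, so total leaf impurity grows with alpha.
print(len(ccp_alphas), ccp_alphas[0], ccp_alphas[-1])
```

Each alpha is the threshold at which one more subtree becomes worth pruning, which is why the table has one row per candidate pruned tree.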
fig, ax = plt.subplots(figsize=(10, 5))
ax.plot(ccp_alphas[:-1], impurities[:-1], marker="o", drawstyle="steps-post")
ax.set_xlabel("effective alpha")
ax.set_ylabel("total impurity of leaves")
ax.set_title("Total Impurity vs effective alpha for training set")
plt.show()
clfs = []
for ccp_alpha in ccp_alphas:
    clf = DecisionTreeClassifier(
        random_state=1, ccp_alpha=ccp_alpha, class_weight={0: 0.15, 1: 0.85}
    )
    clf.fit(x_train2, y_train)
    clfs.append(clf)
print(
    "Number of nodes in the last tree is: {} with ccp_alpha: {}".format(
        clfs[-1].tree_.node_count, ccp_alphas[-1]
    )
)
Number of nodes in the last tree is: 13 with ccp_alpha: 0.015168881899002895
recall_train = []
for clf in clfs:
    pred_train = clf.predict(x_train2)
    values_train = recall_score(y_train, pred_train)
    recall_train.append(values_train)

recall_test = []
for clf in clfs:
    pred_test = clf.predict(X_test2)
    values_test = recall_score(y_test, pred_test)
    recall_test.append(values_test)
index_best_model = np.argmax(recall_test)
best_model = clfs[index_best_model]
print(best_model)
DecisionTreeClassifier(ccp_alpha=0.003, class_weight={0: 0.15, 1: 0.85},
random_state=1)
best_model.fit(x_train2, y_train)
DecisionTreeClassifier(ccp_alpha=0.003, class_weight={0: 0.15, 1: 0.85},
random_state=1)
# Checking model performance
confusion_matrix_sklearn(best_model, x_train2, y_train)
decision_tree_perf_train = model_performance_classification_sklearn(
    best_model, x_train2, y_train
)
decision_tree_perf_train
|   | Accuracy | Recall | Precision | F1 |
|---|---|---|---|---|
| 0 | 0.771600 | 0.998731 | 0.743679 | 0.852538 |
# Checking on Test Set
confusion_matrix_sklearn(best_model, X_test2, y_test)
decision_tree_perf_test = model_performance_classification_sklearn(
    best_model, X_test2, y_test
)
decision_tree_perf_test
|   | Accuracy | Recall | Precision | F1 |
|---|---|---|---|---|
| 0 | 0.767478 | 0.998569 | 0.738992 | 0.849391 |
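`model_performance_classification_sklearn` is a helper defined earlier in the notebook; its exact implementation is not shown in this section. A minimal sketch of what such a one-row performance summary might compute (the function name and column layout are taken from the outputs above; everything else is an assumption):

```python
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.tree import DecisionTreeClassifier


def model_performance_classification_sklearn(model, X, y):
    """Return a one-row DataFrame with Accuracy, Recall, Precision, and F1."""
    pred = model.predict(X)
    return pd.DataFrame(
        {
            "Accuracy": accuracy_score(y, pred),
            "Recall": recall_score(y, pred),
            "Precision": precision_score(y, pred),
            "F1": f1_score(y, pred),
        },
        index=[0],
    )


# Demo on synthetic data (the notebook uses x_train2 / X_test2 instead)
X, y = make_classification(n_samples=200, random_state=1)
model = DecisionTreeClassifier(random_state=1).fit(X, y)
perf = model_performance_classification_sklearn(model, X, y)
print(perf)
```

Returning a DataFrame rather than printing the scores makes it easy to concatenate the train- and test-set rows when comparing models later.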
# Visualizing the best tree
plt.figure(figsize=(15, 20))
out = tree.plot_tree(
    best_model,
    feature_names=feature_names,
    filled=True,
    fontsize=9,
    node_ids=False,
    class_names=None,
)
for o in out:
    arrow = o.arrow_patch
    if arrow is not None:
        arrow.set_edgecolor("black")
        arrow.set_linewidth(1)
plt.show()
# Importance of features in the tree: the importance of a feature is computed as the
# (normalized) total reduction of the criterion brought by that feature,
# also known as the Gini importance.
print(
    pd.DataFrame(
        best_model.feature_importances_, columns=["Imp"], index=x_train2.columns
    ).sort_values(by="Imp", ascending=False)
)
                                        Imp
lead_time                          0.335282
avg_price_per_room                 0.260623
no_of_special_requests             0.210231
market_segment_type_Online         0.116605
arrival_month_December             0.077259
no_of_adults                       0.000000
arrival_month_May                  0.000000
arrival_month_January              0.000000
arrival_month_July                 0.000000
arrival_month_June                 0.000000
arrival_month_March                0.000000
arrival_month_September            0.000000
arrival_month_November             0.000000
arrival_month_October              0.000000
market_segment_type_Complementary  0.000000
market_segment_type_Corporate      0.000000
market_segment_type_Offline        0.000000
arrival_month_February             0.000000
room_type_reserved_Room_Type 7     0.000000
arrival_month_August               0.000000
no_of_weekend_nights               0.000000
room_type_reserved_Room_Type 6     0.000000
room_type_reserved_Room_Type 5     0.000000
room_type_reserved_Room_Type 4     0.000000
room_type_reserved_Room_Type 3     0.000000
room_type_reserved_Room_Type 2     0.000000
required_car_parking_space_Yes     0.000000
type_of_meal_plan_Not Selected     0.000000
type_of_meal_plan_Meal Plan 3      0.000000
type_of_meal_plan_Meal Plan 2      0.000000
arrival_date                       0.000000
arrival_year                       0.000000
no_of_week_nights                  0.000000
repeated_guest_Yes                 0.000000
importances = best_model.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
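Since the Gini importances are normalized, they always sum to 1, which is why the five non-zero features above account for the entire importance mass. A quick self-contained check (synthetic data stands in for the notebook's `best_model`):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=6, random_state=1)
tree_model = DecisionTreeClassifier(random_state=1).fit(X, y)
imps = tree_model.feature_importances_

# One importance per feature, non-negative, summing to 1
print(imps)
```

Features that never appear in a split of the pruned tree receive an importance of exactly zero, as seen for most dummy-encoded columns in the table above.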
What profitable policies for cancellations and refunds can the hotel adopt?
What other recommendations would you suggest to the hotel?